Predicting evidence-based treatment sustainment: results from a longitudinal study of the Adolescent-Community Reinforcement Approach

Abstract

Background

Implementation support models are increasingly being used to enhance the delivery of evidence-based treatments (EBTs) in routine care settings. Little is known about the extent to which these models lead to continued EBT use after implementation support ends. Moreover, few empirical studies longitudinally examine the hypothesized factors associated with long-term psychosocial EBT use (i.e., sustainment). In an effort to address this gap, the current study examined sustainment of an EBT called the Adolescent-Community Reinforcement Approach (A-CRA) following the end of implementation support.

Methods

Between 2006 and 2010, the Substance Abuse and Mental Health Services Administration awarded 3 years of A-CRA implementation support to 82 community-based organizations around the USA. Data on the extent to which A-CRA was sustained following grant end and on the hypothesized factors associated with EBT sustainment were collected both retrospectively and prospectively. We examined the extent to which 10 core treatment elements of A-CRA were sustained and the associations between the extent of A-CRA sustainment and the hypothesized factors using a pattern-mixture longitudinal modeling approach.

Results

Staff from 76 organizations participated in data collection for a 92.86% response rate. On average, about half of the 10 core treatment elements were sustained following the loss of implementation support. Factors that appeared most important to A-CRA sustainment included characteristics that were related to the outer setting (communication, funding, and partnerships), inner setting (political support, organizational capacity, and supervisor turnover rate), implementation support period (number of clinicians and supervisors certified and employed at support end and number of youth served), and staff perceptions of the intervention (implementation difficulty, relative advantage, and perceived success).

Conclusions

Even with multiple years of implementation support, community-based organizations face challenges in sustaining EBT delivery over time. Consistent with implementation theories, multiple factors appear to be related to EBT sustainment, including the degree of implementation during the initial support period, as well as adequate funding, infrastructure support, and staff support following the end of funding.

Background

Research suggests many evidence-based treatments (EBTs) are discontinued or rarely used following the end of initial implementation support [1, 2]. Moreover, the previous literature offers limited empirical evidence about the factors related to psychosocial therapy program sustainment in routine substance use disorder treatment settings [3]. Without effective EBT sustainment, investments in implementation are spent unwisely and public health impact is limited [4, 5]. Thus, there is a need to characterize the extent of EBT sustainment following the end of implementation support and to identify factors that are predictive of sustainment as they may serve as targets for sustainment strategies.

Previous literature

In the past decade, there has been a growing body of evidence on measuring EBT sustainment and the hypothesized predictors [6]. However, recent reviews suggest that research within the health care field is underdeveloped [7, 8]. One issue that has resulted from this work is the diversity of definitions regarding how to assess program sustainment. Generally, it has been assessed by a simple self-reported dichotomous indicator of whether a program is still in operation or not following an initial funding period [6]. Although a dichotomous variable of whether a treatment was delivered or not may be appropriate for medication trials or to understand utilization rates (e.g., [9]), findings from systematic reviews emphasize the need to better capture the extent or quality of sustainment because partial sustainment may not lead to the same desired outcomes observed in research trials, especially in relation to more complex treatments such as psychotherapy. Moreover, the research to date suggests that “partial” sustainment over full sustainment is commonly found after implementation support ends [7, 8, 10].

Systematic literature reviews also suggest that program sustainment is an intricate process, and the determinants or predictors of sustainment are mixed and vary across health care domains. As this study was conceptualized, two major implementation frameworks were published: the Exploration, Preparation, Implementation, and Sustainment model (EPIS; [11]) and the Consolidated Framework for Implementation Research (CFIR; [12]). Both frameworks suggest the relevance of characteristics related to the broader community environment external to the organization (i.e., the "outer" setting or context) and characteristics operating within the organization (the "inner" setting or context), such as organizational leadership, climate and culture, adequate staffing, and organizational processes that help support EBT implementation, such as strategic planning. In addition, characteristics of the individuals charged with implementing the intervention and characteristics of the intervention itself are important. In sum, these conceptual frameworks suggest the need to take into account multiple factors operating at the system, organization, and treatment provider levels to understand EBT adoption, implementation, and sustainment [13].

Taking these conceptual frameworks into account, findings from recent empirical studies have been inconsistent [1] and limited. For example, small samples are common (e.g., prospective data from 32 programs [14]; 11 systems operating in two states [15]), some studies have been limited to demonstration projects (e.g., [10]) rather than programs operated by established organizations, others have examined only a subset of hypothesized factors, such as inner context characteristics (e.g., [5, 16]), and few rigorous studies have used longitudinal data and measured the quality or extent of EBT sustainment. Therefore, a continued need exists for comprehensive empirical studies assessing the extent or quality of sustainment and the numerous hypothesized factors that predict it. Moreover, there is little information about EBT sustainment in relation to psychosocial treatments for substance use, which are typically provided in community-based settings that are often under-resourced and experience high staff turnover [17,18,19]. The current study helps address these gaps by examining the extent to which an EBT is sustained and which implementation factors, drawn from a comprehensive set of potential factors spanning the inner and outer settings along with implementation- and intervention-related characteristics, are associated with sustainment.

Current study

Substance use disorder treatment and implementation support are largely funded in the USA through public sources [20], such as discretionary grants that enhance organizational capacity to utilize EBTs. In this study, we investigated one of the largest investments to date by the Substance Abuse and Mental Health Services Administration's (SAMHSA) Center for Substance Abuse Treatment (CSAT) to address adolescent substance use disorders, which provided multi-year support to community-based organizations to implement the Adolescent-Community Reinforcement Approach (A-CRA). A-CRA has been found in five randomized controlled studies to yield positive outcomes across alcohol use, mental health, and social domains [21,22,23,24,25], and it is listed in SAMHSA's National Registry of Evidence-based Programs and Practices (NREPP). A-CRA supports youth and parents by enhancing family, social, educational, and/or vocational reinforcers to aid in recovery from substance use. As of January 2016, over 270 organizations had received support to deliver A-CRA [26].

A-CRA implementation support model

The discretionary grants that treatment organizations received from SAMHSA during its initial dissemination efforts included approximately $300,000 annually for 3 years, plus three and a half days of in-person A-CRA training for up to five clinical supervisors and clinicians followed by ongoing telephone coaching calls. The training processes were designed based on effective methods for disseminating EBTs [27, 28] and are described in greater detail in Godley et al. [20]. For example, individual clinicians received numeric and narrative feedback on recorded therapy sessions from external A-CRA experts, based on a detailed coding manual [29], until they demonstrated competence and then at random intervals thereafter. A supervisor training process was designed to train supervisors to become internal A-CRA experts and included requirements that supervisors be able to reliably rate recorded therapy sessions and conduct regular internal supervision.

A-CRA sustainment

As with many EBT dissemination efforts, prior to the current study, little was known about how effective the A-CRA implementation support effort was in promoting long-term use of A-CRA. We operationalized sustainment as the extent to which core treatment elements were maintained following the end of the implementation support period and adequate organizational capacity to continue maintaining these core elements was demonstrated. This definition is consistent with the recommendations from a recent systematic review [8], and this operationalization was also recently employed by Aarons and colleagues [15, 30]. The elements align with definitions of implementation fidelity (e.g., [31, 32]) in that they included measurement of treatment dose, quality (as measured by staff knowledge, certification, access to high-quality training, and supervision), and penetration (i.e., the proportion of staff trained and youth treated in A-CRA).

We attempted to capture factors related to external and internal support for A-CRA implementation and to take into account assessments of the extent or quality of implementation during the funding period, which indicate how well organizations had built the capacity to continue delivering A-CRA following the loss of implementation support. This approach is consistent with implementation theories suggesting that program sustainment can only occur within an organization that has first committed to adopting and implementing a treatment or practice [11]. In sum, the purpose of this study was to examine the extent of A-CRA sustainment following the loss of federal funding among community-based organizations and to identify which hypothesized implementation factors were empirically related to sustainment.

Methods

Study context

This project examined A-CRA sustainment among 82 community-based organizations that were awarded an implementation support grant by SAMHSA/CSAT between 2006 and 2010. During that period, there were four program cohorts called the “Assertive Adolescent Family Treatment” initiatives that encouraged the use of A-CRA as the treatment approach (e.g., [33]). In addition, other SAMHSA funding opportunities were offered during this period including the “Juvenile Drug Court” and “Juvenile Drug Treatment Court,” the “Offender Reentry Project,” and the “Targeted Capacity Expansion” initiatives. For these initiatives, the community-based grantee was required to identify an EBT, and several of the funded organizations selected A-CRA and therefore were included in our study sample.

Sample

Organizations

The study sample was composed of established treatment organizations located in the USA. To receive the implementation support grant, applicants were required to demonstrate substance use treatment operation for at least 2 years prior to the proposed project period and compliance with local licensing, accreditation, and certification requirements.

Staff

Clinical supervisors and clinicians from the grantee organizations were recruited to participate. We aimed to enroll at least two individuals from each program who were responsible for youth treatment and familiar with A-CRA. Because clinical supervisors and clinicians were trained and certified at different levels as part of the implementation process, it was important to include both in the study sample.

Data sources

We used administrative data collected from organizations during their respective funding period to assess the implementation-related factors. We also collected three waves of interview and survey data approximately 9 months apart starting in Fall 2013 and ending in Spring 2015.

Primary data collection procedures

We used an administrative dataset containing the contact information of staff who were trained during the 3-year implementation period. Consistent with effective tailored survey methods [34], the recruitment strategy employed multiple methods (i.e., mail, phone, and email) to introduce and remind potential participants about the study. Once the phone interviews were completed, participants were sent an email with instructions on how to access an online survey.

The staff from the grantee organizations were re-contacted approximately 9 months following their first interview (and 9 months after the second interview for the third wave of interviews) to complete another interview and survey describing their experience with delivering A-CRA. If a participant reported at a given wave that their organization was no longer delivering A-CRA, the organization's discontinuation status was confirmed at the following interview wave.

A-CRA sustainment

Overview

Based on A-CRA implementation indicators that were monitored during the implementation support period, we assessed A-CRA sustainment by examining how well each organization maintained the 10 core treatment elements (see Table 1 for details on the measurement and scoring of each element). As shown in the "Source" column of this table, the elements were assessed primarily using information collected from interviews or surveys conducted with clinical supervisors and/or clinicians at each of the participating organizations. For the clinician and supervisor certification elements, self-reported information was verified using training support records from the team responsible for A-CRA certification. In addition, training agendas were collected and reviewed for appropriate content by A-CRA experts.

Table 1 Ten core treatment elements of A-CRA sustainment

Sustainment period and self-reported status

We used administrative data about when the organization stopped receiving the last SAMHSA/CSAT funding (i.e., grant end date) to calculate the time between the funding end date and the study data collection dates. We utilized the number of months since the last grant end date to characterize the time since funding loss and used it as a control variable in the main regression analyses.

Participants were asked at the beginning of the interview whether their organization was currently delivering A-CRA and if the organization had stopped using A-CRA, to report when it had stopped being delivered. A dichotomous indicator of whether the organization was sustaining A-CRA or not was calculated and used to address missing data (see Analytic Plan).

Implementation characteristics

Outer setting characteristics

We used several subscales of the Program Sustainability Assessment Tool (PSAT) [35, 36] to assess outer and inner setting factors. Each subscale consisted of five items, and the alphas in our sample ranged from 0.84 to 0.95. Responses ranged from "to little or no extent" (scored as 1) to "to a great extent" (scored as 7); we summed the responses (for a range of 5–35) and calculated mean values for each scale. The following subscales were used: communications, to capture strategic communication with stakeholders and the public; funding stability, to assess whether staff reported that the A-CRA program had a consistent and stable funding source; partnerships, to measure community support for the program; and political support, to gauge the political environment (e.g., "Political champions advocate for the program"). Of note, some of the subscales assess both outer and inner contextual characteristics or are ambiguous in regard to setting, but we classified them here for simplicity.
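
To make this scoring concrete, the sketch below illustrates how a PSAT subscale value might be computed. It is a minimal illustration based only on the description above; the function and variable names are ours, not the study's.

```python
import numpy as np

def score_psat_subscale(item_responses):
    """Sum one respondent's five PSAT subscale items (each rated 1-7),
    giving a subscale score in the 5-35 range described above."""
    items = np.asarray(item_responses, dtype=float)
    if items.shape != (5,) or items.min() < 1 or items.max() > 7:
        raise ValueError("expected five responses on the 1-7 scale")
    return items.sum()

# Illustrative use: a site-level value taken as the mean across respondents.
staff_sums = [score_psat_subscale(r) for r in ([6, 7, 5, 6, 6], [4, 5, 5, 4, 6])]
site_value = float(np.mean(staff_sums))  # 27.0 for these two example respondents
```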

Inner setting characteristics

Structural factors: We examined the comprehensiveness of the organizations by counting the services offered at each organization, using a survey question from the National Survey of Substance Abuse Treatment Services [37]. The range on this variable was from 0 to 17. We also asked what the primary focus of the organization was, with the following response options: substance use treatment services, mental health services, a mix of mental health and substance use services (neither is primary), general health care, and other. A binary variable indicated whether staff reported that substance use treatment was the primary service (coded "1") as compared to all other options (coded "0").

Staffing characteristics: We assessed current staffing characteristics that may help explain the organization's capacity to sustain A-CRA. More specifically, we calculated clinical supervisor and clinician turnover rates for the past 6 months based on interview questions asked of supervisors. Given that treatment implementation may be influenced by the number of staff available, we also included a client-to-staff ratio calculated from supervisor reports of the number of youth served and the number of clinicians serving youth in the past 6 months.

Staff attributes: The Organizational Readiness for Change (ORC) scale [38] operationalizes staff attributes using the following subscales: adaptability, efficacy, and influence. We also asked both supervisors and clinicians about organizational climate using items from the ORC scale [38] covering the following subdomains: autonomy, cohesion, communication, mission, and stress. Responses on both ORC scales were scored on a five-point scale from "strongly disagree" (coded as 1) to "strongly agree" (coded as 5). Responses were summed and averaged within each subscale of the organizational climate domain; the subscale values were then averaged and multiplied by 10 to create a value between 10 and 50 (a sketch of this scoring follows below).

Implementation leadership: We included the 12-item version of the Implementation Leadership Scale [39]. Clinician responses were recorded on a five-point scale from 1 ("strongly disagree") to 5 ("strongly agree"), then summed and averaged. We assessed organizational resources and support to effectively manage A-CRA using the organizational capacity subscale of the PSAT, which was asked of both clinicians and supervisors. Strategic planning for sustainability was assessed using the corresponding PSAT subscale.
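
The organizational climate scoring just described can be expressed compactly. The following is a minimal sketch based only on that description, with illustrative subdomain item values; it is not the study's code, and no reverse-scoring or other measure-specific rules beyond those stated above are assumed.

```python
import numpy as np

def score_orc_climate(subdomain_items):
    """Average the item responses (1-5) within each climate subdomain,
    average those subscale means, and multiply by 10 for a 10-50 value."""
    subscale_means = [np.mean(items) for items in subdomain_items.values()]
    return 10 * float(np.mean(subscale_means))

climate = score_orc_climate({
    "autonomy": [4, 3, 4],
    "cohesion": [5, 4, 4],
    "communication": [3, 3, 4],
    "mission": [4, 4, 5],
    "stress": [2, 3, 2],
})  # 36.0 for these illustrative responses
```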

Implementation-related characteristics

The extent to which an organization has consistently implemented an intervention during the funding period is likely to influence how well it can be sustained after the initial support period [11, 40]. To examine this, we utilized data collected during the funding period on (1) the number of adolescents who received A-CRA, (2) the number of clinical supervisors certified in A-CRA and still employed at each organization at grant end, and (3) the number of clinicians certified in A-CRA and still employed at each organization at grant end.

Perceptions of the intervention

Several theories suggest that perceptions of a particular innovation’s ease of use and benefit over alternative options will influence its adoption, use, and presumably longer-term sustainment [11, 12, 41, 42]. We included assessments of staff perceptions of A-CRA’s complexity and relative advantage from Steckler et al. [43] (alphas = 0.88 and 0.83, respectively). We included staff perceptions of implementation difficulty and perceived success using five-item scales developed from O’Loughlin et al. [44] (alphas = 0.57 and 0.91, respectively). All of these survey items had response options on a five-point scale ranging from “strongly disagree” (scored as a “1”) to “strongly agree” (scored as a “5”) for a summed score range of 4–20 (for the relative advantage scale) or 5–25 (for the complexity, difficulty, and success scales).

Analytic plan

First, data collected from staff were aggregated to the organization-by-wave level. The hypothesized outer setting, inner setting, and intervention-related variables were constructed using assessments collected during what we considered the "critical period," that is, the first 18 months following the end of the implementation support period.

Prior to the longitudinal analyses, each of the 10 elements was standardized to a 0–1 range and the elements were summed, yielding a score ranging from 0 to 10 at the site-by-wave level. Missing element data were imputed using the means of observed data within each study wave and self-reported sustainment status. This summed score is the main sustainment outcome in the statistical analyses.
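
The following is a minimal sketch of this scoring step under the description above; the column names (e.g., "wave", "self_reported_sustaining", "element_1" through "element_10") are illustrative placeholders rather than the study's variable names.

```python
import pandas as pd

ELEMENT_COLS = [f"element_{i}" for i in range(1, 11)]  # the 10 core elements

def sustainment_score(df: pd.DataFrame) -> pd.Series:
    """Return a 0-10 sustainment score for each site-by-wave row."""
    # Impute missing element values with the mean of observed values within
    # the same study wave and self-reported sustainment status.
    imputed = df.groupby(["wave", "self_reported_sustaining"])[ELEMENT_COLS].transform(
        lambda s: s.fillna(s.mean())
    )
    # Rescale each element to 0-1 and sum across the 10 elements.
    scaled = (imputed - imputed.min()) / (imputed.max() - imputed.min())
    return scaled.sum(axis=1)
```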

As planned [45], we conducted exploratory analyses to identify potential clustering of sustainment trajectories. Based on the cluster analysis, the organizations were grouped into two sustainment patterns. The first pattern consisted of organizations that reported stopping A-CRA delivery within 12 months after the end of the funding period. The second pattern consisted of organizations that reported sustaining A-CRA for longer than 12 months in the post-funding period (i.e., they either stopped A-CRA after the 12th month post-funding or continued sustaining A-CRA without an observed stoppage). Because organizations in the first pattern had a shorter observation period (no more than two waves) than organizations in the second pattern, it was necessary to adjust for the mixture of patterns in order to obtain unbiased estimates of the predictor effects [46, 47]. Lastly, we fitted a set of pattern-mixture longitudinal models to estimate the marginal relationship between the sustainment outcome and each predictor separately. In these pattern-mixture models, we also controlled for time since funding ended (in months). We used random effects to account for intra-site correlations among the longitudinal measures of sustainment scores. The mixed-effects model also allows for unsynchronized measurement times, so that we could consistently estimate the marginal associations. To account for multiple comparisons, we applied the step-up method to adjust p values, controlling the false discovery rate at the .05 level [48].
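
As a rough illustration of this modeling strategy, the sketch below fits one random-intercept model per predictor, with the pattern indicator and months since funding end as covariates, and then applies a step-up false discovery rate adjustment. It is an assumed reconstruction, not the authors' code: the data frame and all column and predictor names are hypothetical, and the exact specification (e.g., the random-effects structure) may differ from the study's.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

def marginal_associations(data: pd.DataFrame, predictors: list[str]) -> pd.DataFrame:
    """Fit one pattern-mixture-style mixed model per predictor and adjust p values.

    `data` is assumed to hold one row per organization per wave, with columns
    "sustainment" (the 0-10 score), "pattern" (the sustainment-pattern group),
    "months_since_funding_end", and "site_id"; these names are placeholders.
    """
    coefs, pvalues = [], []
    for pred in predictors:
        result = smf.mixedlm(
            f"sustainment ~ {pred} + pattern + months_since_funding_end",
            data=data,
            groups=data["site_id"],  # random intercept per organization
        ).fit()
        coefs.append(result.params[pred])
        pvalues.append(result.pvalues[pred])
    # Step-up adjustment controlling the false discovery rate at .05
    # (Benjamini-Yekutieli, as in reference [48]).
    reject, p_adj, _, _ = multipletests(pvalues, alpha=0.05, method="fdr_by")
    return pd.DataFrame(
        {"predictor": predictors, "coef": coefs, "p_fdr": p_adj, "significant": reject}
    )
```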

Results

Sample characteristics

One hundred sixty-nine respondents (33% who classified themselves as supervisors, 41% as clinicians, and 26% as both supervisors and clinicians) from 78 organizations participated in the data collection, for a 92.86% response rate at the site level. Time since funding loss ranged from 1 to 63 months across the 78 organizations. Table 2 shows the distribution of organizations by funding source. While most organizations received one grant (n = 63), 15 organizations received multiple rounds of funding, including four organizations that received three or more awards. Most of these multi-funded sites (12 of the 15 organizations) were in the second mixture pattern (i.e., demonstrating sustainment for longer than 12 months in the post-funding period).

Table 2 The number of participating organizations by the different funding mechanisms

The average age of participants was 41.74 years (SD = 11.57). Sixty-five percent of the sample was female. Thirty-three percent of the sample reported that they were of Hispanic origin. The racial composition reported by respondents was 70% White; 13% Black/African-American; 5% Native American or Alaskan; 2% Asian, Native Hawaiian, or other Pacific Islander; 1% more than one race; and 10% did not endorse a race. Sixty-five percent of the sample reported a master's level of education, 22% a bachelor's level education, 5% an associate's level education, 4% a doctoral degree, and 4% some college. The average number of years of substance use disorder counseling experience in the sample was 9.44 (SD = 7.84).

Extent of A-CRA sustainment

Descriptive statistics for the 10 elements that comprise the sustainment outcome are reported in Table 3. Although clinical knowledge was generally good (i.e., overall percentage correct was 70%) and most organizations (77%) had a certified A-CRA supervisor on staff, many of the other elements were not sustained at recommended levels. For example, less than half of the clinicians on staff were A-CRA certified, and less than half of the organizations reported delivering the recommended dosage of A-CRA. Consequently, on average, about 40% of adolescents were receiving A-CRA at these organizations. Elements related to clinical supervision, including frequency and content, were lower than recommended. Supervisor knowledge of the A-CRA training and certification process was fairly low (30%), and onsite training quality was especially low, with on average only 15% of the desired features included in the training agendas. Organizations that sustained A-CRA for more than 12 months had higher values on 7 of the 10 elements. Elements related to clinical supervision frequency and content were similar or higher in organizations in the group sustaining A-CRA for a year or less.

Table 3 Descriptive statistics of the 10 core A-CRA elements

Factors associated with A-CRA sustainment

Mean values or proportions for the hypothesized factors and the sustainment outcome are presented in Table 4. The table shows the distributions by the two groupings: organizations that sustained A-CRA for 12 months or less and organizations that sustained A-CRA for longer than 12 months. In general, these mean values and proportions show that organizations with longer periods of sustainment had more support in terms of the outer and inner settings and implementation- and intervention-related characteristics.

Table 4 Descriptive statistics on the hypothesized factors and outcome by sustainment time

Table 5 presents the results from the marginal regression analyses, including coefficients and p values adjusted to control the false discovery rate at the .05 level. The results show that higher ratings on the PSAT scales related to communications, funding stability, partnerships, and political support (i.e., primarily external setting factors), as well as organizational capacity and strategic planning (i.e., internal setting factors), were associated with greater levels of A-CRA sustainment. The results also show that higher rates of clinical supervisor turnover were associated with lower levels of A-CRA sustainment. Performance during the implementation support period, as assessed by the number of youth who received A-CRA and the number of clinical supervisors and clinicians who had been certified and were still employed at the organization at the end of the implementation support period, was strongly associated with the extent to which A-CRA was sustained. Finally, perceptions of A-CRA by clinical staff were also associated with the extent to which A-CRA was sustained. Staff who perceived A-CRA as relatively easy to implement were more likely to be employed at sites that sustained higher levels of A-CRA. Also, perceptions that A-CRA was an efficacious treatment and better than other available treatments were associated with higher levels of A-CRA sustainment.

Table 5 Results from the marginal regression analyses predicting the extent of sustainment

Discussion

In this study, we explored the extent to which an evidence-based treatment for adolescent substance use (i.e., A-CRA) was sustained following the end of federally funded implementation support grants among 78 community-based organizations. Using recently recommended approaches, we operationalized A-CRA sustainment as the extent to which organizations maintained core treatment elements following the end of the implementation support period and demonstrated adequate organizational capacity to continue maintaining these core elements. Our results showed that, following the loss of implementation support, partial sustainment of the core elements was common. These findings are comparable to previous studies documenting that the quality of program delivery tends to decline after implementation support ends (e.g., [10, 15, 49]).

Examining the scores on the 10 core elements helps explain the capacities that may facilitate or hinder organizations' continuation of A-CRA. Organizations that continued A-CRA beyond 12 months were more likely to have a certified A-CRA clinical supervisor on staff and a higher proportion of A-CRA-certified clinicians than organizations with shorter sustainment periods. However, staff from organizations that reported sustaining A-CRA for a year or less reported higher scores on the clinical supervision frequency and content elements than organizations that reported sustaining A-CRA for longer periods. This pattern suggests that organizations that maintain A-CRA delivery may be doing so with lower supervision quality, which may have implications for treatment outcomes. The results also suggest that organizations that maintain higher clinical supervision intensity and content alignment with A-CRA are less likely to continue A-CRA than organizations whose supervision is less intensive and less aligned with the A-CRA content.

Our findings are in line with others that have found that "partial" rather than full sustainment is common after implementation support ends [7, 8, 10]. This is the first study of its kind to examine sustainment of an adolescent substance use EBT, so it is not possible to judge whether the level of sustainment is better or worse for this type of intervention. Additionally, the measures used for sustainment are unique to the EBT. For example, Bond et al. [1] reported a 96% sustainment rate for an Individual Placement and Support (IPS) learning community approach based on an operational definition of sustainment that included whether or not a program continued to employ staff, maintained an active client caseload, provided direct services, and adhered to core principles. The latter rate is impressive, but assessing adherence to a set of core principles appears to be a more general measure of sustainment than adherence to very specific treatment procedures. Because EBTs differ in their characteristics, it is important, as the study of sustainment grows, to make comparisons across similar types of programs or treatments. Further work is needed to develop more standardized measures of sustainment. Differences between the sustainment rates reported here and those in the Bond et al. study may also be due to a broader availability of funding for the IPS services, which included state and county agencies rather than one-time federal grants to community-based organizations. Such organizations may embrace the opportunity for federal funding so that they can hire additional staff and serve more individuals, even if they know it will be difficult to sustain a given practice after the time-limited funding ends. Sustainment of A-CRA or other EBTs that address substance use may improve with increased support, including reimbursement for both treatment delivery and clinical supervision, so that adequate time is available for the quality clinical supervision needed to achieve the outcomes found in the research studies.

When we examined which factors were associated with the extent of A-CRA sustainment, we found that, consistent with implementation frameworks, characteristics of both the outer and inner settings were related to sustainment. How well the organization was positioned to sustain A-CRA at the end of the implementation support period, based on its performance and capacity at grant end, was also a strong predictor of the extent of sustainment. Finally, staff perceptions of A-CRA were strongly associated with the extent of sustainment.

These findings are consistent with previous theories suggesting that multiple factors, both external to and within an organization, are key to EBT sustainment [12, 41, 50]. Our study findings are also relatively consistent with the emerging empirical literature on EBT sustainment. For example, Hodge et al. [5] examined factors related to EBT sustainment among a large sample of "Triple P" (i.e., a parenting prevention program) providers. Similarly, they found that organizational supports (e.g., clinical supervision) and intervention-related characteristics (e.g., perceptions of implementation ease) were important. However, their study was limited in that external contextual factors were not examined. Peterson et al. [16] also examined inner context or setting factors related to the sustainment of five mental health EBTs over an 8-year period. Unlike our study, they did not find implementation quality predictive of long-term sustainment; however, like our study, staff turnover appeared important. Tibbits et al. [14] examined the sustainment of crime and delinquency programming in school settings up to 3 years post-funding. The investigators found that leadership support, overall school support, adequate staffing, financial stability planning, and aligning the intervention with the setting (i.e., "fit") were related to self-reported sustainment among a small sample of programs.

The study findings build on the results reported from a previous study of a subset of the same organizations (n = 68) that used a binary measure of self-reported A-CRA sustainment [51]. In that study, we also found support for associations between external setting, inner setting, implementation-related, and intervention-related factors and A-CRA sustainment. However, a few variables that were related to A-CRA sustainment in that study were not significantly associated with the extent of sustainment in this study (and vice versa). More specifically, in the previous study, the type of agency and perceptions of A-CRA's complexity appeared important, whereas external setting factors such as communications and partnerships and internal factors such as organizational capacity were not statistically significant. A key difference between the two studies is the outcome: the previous study examined whether staff reported that A-CRA was continued at the site, whereas the current study used a more comprehensive measure of the extent to which A-CRA was delivered. These distinctions are important because continuing A-CRA without regard to the core elements may not lead to the same quality of delivery and thus the same outcomes.

In sum, this study demonstrates that many characteristics cited by implementation theories or frameworks are critical to the quality with which an EBT for adolescent substance use disorders is sustained in community-based substance use treatment organizations after implementation support ends. These findings point to the relevance of paying close attention to both internal and external supports for an organization to assist with EBT sustainment. Internal supports include maintaining qualified clinicians and supervisors for psychosocial treatments. External supports include the availability of other sources of funding and support in the external environment once implementation funding ends.

It is also relevant to note that many theorized constructs were not found to be statistically significantly related to A-CRA sustainment, including clinician turnover rate, perceptions of leadership support, staff attributes, perceptions of A-CRA complexity, and characteristics of the organization, including structural factors (focus, comprehensiveness) and climate. We think the effect of clinician turnover may have been buffered by the support provided to supervisors to train and certify clinicians locally. In support of this, supervisor turnover was significantly related to sustainment. It appears that external factors and how well the organization was prepared at grant end, as demonstrated by certified staff and experience treating youth, along with organizational factors such as capacity, political support, and strategic planning, were more critical to sustaining A-CRA than these other inner setting characteristics. Thus, the size or mission of the organization appears less important than ongoing funding and support from outside the agency coupled with well-trained staff who perceive the intervention positively.

We address several limitations of previous research on health care program sustainment. For example, few studies have used conceptual implementation frameworks to inform their work. This study was developed taking into account several existing conceptual approaches to program implementation and sustainment, and we therefore assessed factors both external and internal to the organization, along with intervention-specific factors related to sustainment. Also, few studies have employed longitudinal, prospective methods. Given the extensive data collected during the implementation period and the longitudinal nature of this study, we were able to examine several variables prospectively to predict the extent of A-CRA sustainment. Moreover, we assessed the extent of sustainment by measuring the level of 10 core treatment elements rather than relying on a dichotomous measure of whether staff reported that their organization continued to deliver A-CRA. In sum, this study addresses many important gaps in previous research.

Limitations

Some limitations to this study are important to acknowledge. First, this study has a relatively small sample. We targeted the entire population of organizations that were funded and achieved over a 90% response rate; however, to study the multitude of hypothesized factors and their potential interaction effects, a larger sample would be required. This is a commonly noted challenge in implementation research, where the main analyses are often conducted at the organizational level rather than at the patient level [32]. Due to the sample size, we were unable to reliably estimate the effects of all of the hypothesized predictors in one model; therefore, our findings cannot address questions about the relative impact of the different hypothesized predictors compared to one another on the extent of sustainment. For example, we cannot address whether external factors are more or less important than staff experience with or perceptions of A-CRA. Also relevant, the CFIR contains 39 constructs, and we did not examine all of them. Moreover, we were not able to examine clinical program outcomes and whether the extent of sustainment limits the impact of A-CRA on youth substance use and related outcomes. We also did not verify some of the A-CRA elements (e.g., frequency and content of supervision); staff may have endorsed higher levels of these activities than were actually provided. Future research is warranted on whether the varying levels of A-CRA sustainment observed following funding loss influence the effectiveness of A-CRA.

Conclusions

Despite the recent development of several evidence-based treatment (EBT) programs for substance use, less than half of adolescents are positively discharged from treatment [52], suggesting the need for wider delivery of effective treatments. Although A-CRA has demonstrated effectiveness, we found that following an implementation support period, community-based substance use treatment organizations found longer-term sustainment challenging. Successful implementation during the initial funding period appeared to be an important factor in longer-term sustainment. This finding demonstrates the need to closely monitor and support implementation during the funding phase. Other important external factors were funding stability and community and political support for the treatment. Critical organizational factors included adequate supervisor staffing, organizational capacity, and positive perceptions of the treatment by clinical staff. As government and other entities consider support for the implementation of EBTs, it is important for them to consider what types of settings, infrastructure, and organizational factors should be present during the selection process to ensure investments are well spent.

Abbreviations

A-CRA:

Adolescent community reinforcement approach

CFIR:

Consolidated framework for implementation research

CSAT:

Center for substance abuse treatment

EBT:

Evidence-based treatment

EPIS:

Exploration, Preparation, Implementation, and Sustainment

IPS:

Individual placement and support

M:

Mean

ORC:

Organizational readiness for change

PSAT:

Program sustainability assessment tool

SAMHSA:

Substance abuse and mental health services administration

SD:

Standard deviation

SE:

Standard error

References

1. Bond GR, Drake RE, Becker DR, Noel VA. The IPS learning community: a longitudinal study of sustainment, quality, and outcome. Psychiatr Serv. 2016;67:864–9.
2. Hodge LM, Turner KM. Sustained implementation of evidence-based programs in disadvantaged communities: a conceptual framework of supporting factors. Am J Community Psychol. 2016;58:192–210.
3. Garner BR. Research on the diffusion of evidence-based treatments within substance abuse treatment: a systematic review. J Subst Abuse Treat. 2009;36:376–99.
4. Aarons GA, Green AE, Willging CE, Ehrhart MG, Roesch SC, Hecht DB, Chaffin MJ. Mixed-method study of a conceptual model of evidence-based intervention sustainment across multiple public-sector service settings. Implement Sci. 2014;9:183.
5. Hodge LM, Turner KM, Sanders MR, Filus A. Sustained implementation support scale: validation of a measure of program characteristics and workplace functioning for sustained program implementation. J Behav Health Serv Res. 2016;1–22.
6. Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101:2059–67.
7. Ament SM, de Groot JJ, Maessen JM, Dirksen CD, van der Weijden T, Kleijnen J. Sustainability of professionals’ adherence to clinical practice guidelines in medical care: a systematic review. BMJ Open. 2015;5:e008073.
8. Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:17.
9. Brookman-Frazee L, Stadnick N, Roesch S, Regan J, Barnett M, Bando L, Innes-Gomberg D, Lau A. Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Adm Policy Ment Health. 2016;43:1009–22.
10. Cooper BR, Bumbarger BK, Moore JE. Sustaining evidence-based prevention programs: correlates in a large-scale dissemination initiative. Prev Sci. 2015;16:145–57.
11. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
12. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
13. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36:24–34.
14. Tibbits MK, Bumbarger BK, Kyler SJ, Perkins DF. Sustaining evidence-based interventions under real-world conditions: results from a large-scale diffusion project. Prev Sci. 2010;11:252–62.
15. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, Roesch SC. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm Policy Ment Health. 2016;43:991–1008.
16. Peterson AE, Bond GR, Drake RE, McHugo GJ, Jones AM, Williams JR. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis. J Behav Health Serv Res. 2014;41:337–46.
17. Eby LT, Burk H, Maher CP. How serious of a problem is staff turnover in substance abuse treatment? A longitudinal study of actual turnover. J Subst Abuse Treat. 2010;39:264–71.
18. Garner BR, Hunter BD. Examining the temporal relationship between psychological climate, work attitude, and staff turnover. J Subst Abuse Treat. 2013;44:193–200.
19. Garner BR, Hunter BD, Modisette KC, Ihnes PC, Godley SH. Treatment staff turnover in organizations implementing evidence-based practices: turnover rates and their association with client outcomes. J Subst Abuse Treat. 2012;42:134–42.
20. Godley SH, Garner BR, Smith JE, Meyers RJ, Godley MD. A large-scale dissemination and implementation model for evidence-based treatment and continuing care. Clin Psychol (New York). 2011;18:67–83.
21. Dennis M, Godley SH, Diamond G, Tims FM, Babor T, Donaldson J, Liddle H, Titus JC, Kaminer Y, Webb C, Hamilton N, Funk R. The Cannabis Youth Treatment (CYT) study: main findings from two randomized trials. J Subst Abuse Treat. 2004;27:197–213.
22. Godley MD, Godley SH, Dennis ML, Funk RR, Passetti LL. The effect of assertive continuing care on continuing care linkage, adherence and abstinence following residential treatment for adolescents with substance use disorders. Addiction. 2007;102:81–93.
23. Godley MD, Godley SH, Dennis ML, Funk RR, Passetti LL, Petry NM. A randomized trial of assertive continuing care and contingency management for adolescents with substance use disorders. J Consult Clin Psychol. 2014;82:40–51.
24. Henderson CE, Wevodau AL, Henderson SE, Colbourn SL, Gharagozloo L, North LW, Lotts VA. An independent replication of the Adolescent-Community Reinforcement Approach with justice-involved youth. Am J Addict. 2016;25:233–40.
25. Slesnick N, Prestopnik JL, Meyers RJ, Glassman M. Treatment outcome for street-living, homeless youth. Addict Behav. 2007;32:1237–51.
26. Godley SH, Smith JE, Meyers RJ, Godley MD. The Adolescent Community Reinforcement Approach: a clinical guide for treating substance use disorders. Normal: Chestnut Health Systems; 2016.
27. Miller WR, Yahne CE, Moyers TB, Martinez J, Pirritano M. A randomized trial of methods to help clinicians learn motivational interviewing. J Consult Clin Psychol. 2004;72:1050–62.
28. Sholomskas DE, Syracuse-Siewert G, Rounsaville BJ, Ball SA, Nuro KF, Carroll KM. We don’t train in vain: a dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. J Consult Clin Psychol. 2005;73:106–15.
29. Smith JE, Lundy L, Gianini L. Community reinforcement approach (CRA) and adolescent community reinforcement approach (A-CRA) therapist coding manual. Bloomington: Lighthouse Institute; 2007.
30. Green AE, Trott E, Willging CE, Finn NK, Ehrhart MG, Aarons GA. The role of collaborations in sustaining an evidence-based intervention to reduce child neglect. Child Abuse Negl. 2016;53:4–16.
31. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:40.
32. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.
33. Substance Abuse & Mental Health Services Administration. Application information: Center for Substance Abuse Treatment (CSAT). 2009. http://media.samhsa.gov/Grants/2009/ti_09_002.aspx. Accessed 26 Oct 2015.
34. Dillman DA, Smyth JD, Christian LM. Internet, mail, and mixed-mode surveys: the tailored design method. 3rd ed. Hoboken: Wiley & Sons; 2009.
35. Schell SF, Luke DA, Schooley MW, Elliott MB, Herbers SH, Mueller NB, Bunger AC. Public health program capacity for sustainability: a new framework. Implement Sci. 2013;8:15.
36. Washington University in St Louis. Program sustainability assessment tool. 2012. https://sustaintool.org/. Accessed 27 Oct 2015.
37. Substance Abuse and Mental Health Services Administration. National Survey of Substance Abuse Treatment Services (N-SSATS): 2010. Data on substance abuse treatment facilities. Rockville: U.S. Dept. of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality; 2011.
38. Lehman WE, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22:197–209.
39. Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9:45.
40. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manage Rev. 1996;21:1055–80.
41. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
42. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.
43. Steckler A, Goodman RM, McLeroy KR, Davis S, Koch G. Measuring the diffusion of innovative health promotion programs. Am J Health Promot. 1992;6:214–24.
44. O’Loughlin J, Renaud L, Richard L, Gomez LS, Paradis G. Correlates of the sustainability of community-based heart health promotion interventions. Prev Med. 1998;27:702–12.
45. Hunter SB, Ayer L, Han B, Garner BR, Godley SH. Examining the sustainment of the Adolescent-Community Reinforcement Approach in community addiction treatment settings: protocol for a longitudinal mixed method study. Implement Sci. 2014;9:104.
46. Hedeker D, Gibbons RD. Application of random-effects pattern-mixture models for missing data in longitudinal studies. Psychol Methods. 1997;2:64–78.
47. Hogan JW, Laird NM. Model-based approaches to analysing incomplete longitudinal and failure time data. Stat Med. 1997;16:259–72.
48. Benjamini Y, Yekutieli D. The control of the false discovery rate in multiple testing under dependency. Ann Stat. 2001;29(4):1165–88.
49. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Eval. 2005;26:320–47.
50. Goodman RM, McLeroy KR, Steckler AB, Hoyle RH. Development of level of institutionalization scales for health promotion programs. Health Educ Q. 1993;20:161–78.
51. Hunter SB, Han B, Slaughter ME, Godley SH, Garner BR. Associations between implementation characteristics and evidence-based practice sustainment: a study of the Adolescent Community Reinforcement Approach. Implement Sci. 2015;10:173.
52. Substance Abuse and Mental Health Services Administration, Office of Applied Studies. Treatment Episode Data Set (TEDS): 2007. Rockville: U.S. Dept. of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Behavioral Health Statistics and Quality; 2007.
53. Godley SH, Meyers RJ, Smith JE, Godley MD, Titus J, Karvinen T, Dent G, Passetti L, Kelberg P. The adolescent community reinforcement approach (ACRA) for adolescent cannabis users. Rockville: Center for Substance Abuse Treatment, Substance Abuse and Mental Health Services Administration; 2001.
54. Hupp C, Mertig K, Krall K, Godley MD, Godley SH. Adolescent community reinforcement approach (A-CRA) and assertive continuing care (ACC) supervisor rating manual. Normal: Chestnut Health Systems; 2009.

Acknowledgements

We thank Chau Pham for data collection management, Colleen McCullough for data analysis management, and Tiffany Hruby for assistance with manuscript preparation. Finally, the authors would like to thank all of the participating staff at the adolescent treatment programs we contacted, without whom this research would not have been possible.

Funding

This work was supported by the National Institute on Alcohol Abuse and Alcoholism (NIAAA) grant R01AA021217 to Sarah B. Hunter. The content is solely the responsibility of the authors and does not necessarily represent the official views of NIAAA or the National Institutes of Health.

Availability of data and materials

The datasets used during the current study are available from the corresponding author.

Authors’ contributions

SBH, BH, BRG, and SHG conceptualized the study. SBH is the PI and has overall responsibility for the execution of the project. BH conceptualized the study’s analytical plan, and MES assisted in its execution. All authors were involved in developing and editing of the manuscript and have given final approval of the submitted version.

Competing interests

Authors Sarah B. Hunter, Bing Han, Mary E. Slaughter, Susan H. Godley, and Bryan R. Garner declare no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

The study was approved by the Principal Investigator’s Institutional Review Board (Federalwide Assurance Number: 00003425), and informed consent was received from all participants prior to data collection.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Corresponding author

Correspondence to Sarah B. Hunter.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Hunter, S.B., Han, B., Slaughter, M.E. et al. Predicting evidence-based treatment sustainment: results from a longitudinal study of the Adolescent-Community Reinforcement Approach. Implementation Sci 12, 75 (2017). https://doi.org/10.1186/s13012-017-0606-8

