A methodology for generating a tailored implementation blueprint: an exemplar from a youth residential setting

Abstract

Background

Tailored implementation approaches are touted as more likely to support the integration of evidence-based practices. However, to our knowledge, few methodologies for tailoring implementations exist. This manuscript will apply a model-driven, mixed methods approach to a needs assessment to identify the determinants of practice, and pilot a modified conjoint analysis method to generate an implementation blueprint using a case example of a cognitive behavioral therapy (CBT) implementation in a youth residential center.

Methods

Our proposed methodology contains five steps to address two goals: (1) identify the determinants of practice and (2) select and match implementation strategies to address the identified determinants (focusing on barriers). Participants in the case example included mental health therapists and operations staff in two programs of Wolverine Human Services. For step 1, the needs assessment, they completed surveys (clinician N = 10; operations staff N = 58; other N = 7) and participated in focus groups (clinician N = 15; operations staff N = 38) guided by the domains of the Framework for Dissemination [1]. For step 2, the research team conducted mixed methods analyses following the QUAN + QUAL structure for the purpose of convergence and expansion in a connecting process, revealing 76 unique barriers. Step 3 consisted of a modified conjoint analysis. For step 3a, agency administrators prioritized the identified barriers according to feasibility and importance. For step 3b, strategies were selected from a published compilation and rated for feasibility and likelihood of impacting CBT fidelity. For step 4, sociometric surveys informed implementation team member selection and a meeting was held to identify officers and clarify goals and responsibilities. For step 5, blueprints for each of the pre-implementation, implementation, and sustainment phases were generated.

Results

Forty-five discrete strategies were prioritized across the 5 years and three phases, representing all nine strategy categories.

Conclusions

Our novel methodology offers a relatively low-burden, collaborative approach to generating a plan for implementation that leverages advances in implementation science, including measurement, models, strategy compilations, and methods from other fields.

Background

Training and consultation are an empirically supported, integral set of strategies for implementing psychosocial interventions such as cognitive behavioral therapy (CBT). However, mounting literature suggests that training plus consultation, as a standalone implementation intervention, is unlikely to achieve full penetration and agency-wide long-term sustainment [2]. Complex settings, such as residential treatment, where team-based care is the primary mode of care delivery, are likely to require additional strategies such as reorganization of clinical teams, revision of job descriptions, and engagement of clinical champions to fully integrate new evidence-based psychosocial interventions (EBPs). With the demand for EBPs growing, it is important to determine how best to support their uptake, implementation, and sustainment to maximize the investment of community partners.

Tailoring strategies to the contextual factors that influence the implementation process is touted as critical to achieving success [3, 4]; however, few concrete examples of tailoring methods exist. Rather, tailoring strategies to the context has recently been described as the “black box” of implementation [5]. Blueprints could arguably be tailored at the program, site, or organization level, depending on interest and resources; sites within an organization can have significantly different contexts that may lead to unique tailored blueprints, although it remains unclear whether program-level tailored blueprints are needed. Revealing what is inside the black box is critical to advancing the science and practice of implementation and ensuring that successful implementation efforts are replicable.

A methodology for prospectively tailoring strategies has the potential to yield a formal implementation blueprint, which Powell et al. [6] define as a plan that specifies all goals and strategies and the scope of change, along with a timeframe, milestones, and appropriate performance/progress measures. Two goals can be used to organize the steps in a tailoring methodology that yields a blueprint. The first goal is to identify the determinants of practice, or the barriers and facilitators, of an organization. Krause et al. [7] recently compared brainstorming (open versus questionnaire guided) among health professionals, as well as interviews of patients and professionals, as methods for achieving this goal. They concluded that no single method emerged as superior and instead recommended that a combined approach be used.

The second goal is to select and match implementation strategies to address the identified determinants (typically focusing on barriers) across the phases of pre-implementation (or exploration + preparation; [8]), implementation, and sustainment. Powell et al. [9] provide some methodological guidance in a recent conceptual paper for how to achieve this goal. Among their examples is conjoint analysis, which is described as having the advantages of providing a clear step-by-step method for selecting and tailoring strategies to unique settings that forces stakeholders to consider strategy-context match at a granular level. Importantly, Huntink et al. [10] found little to no difference in strategy generation across stakeholder categories (e.g., researchers, quality officers, health professionals) suggesting that involvement of stakeholders is important but equal representation and contribution is not necessary for sound strategy selection. Although literature is becoming available to aid in elucidating many of the critical steps to achieving these goals of determinant identification and strategy selection, no study has illustrated a method for establishing a formal tailored implementation blueprint.

The objective of this paper is to put forth a novel, empirically driven, collaborative methodology for generating a formal tailored blueprint using a CBT implementation at two sites of a youth residential setting as an exemplar. We will apply a model-driven, mixed methods approach to a “needs assessment” to identify the determinants of practice (focusing on barriers) and then pilot a modified conjoint analysis method to collaboratively generate an implementation blueprint. We will use a case study to describe this methodology in order to position others to engage in a structured, stakeholder-driven approach to prospectively build a tailored plan for implementation.

Methods

Formation of the academic-community partnership

Developing a tailored implementation blueprint would be challenging outside of an academic-community partnership. There are numerous models for how to initiate, operate, and sustain partnerships focused on EBP implementation (e.g., [11]). In our case example, administrators of a large Midwestern residential youth service treatment center, Wolverine Human Services, identified CBT as the optimal intervention to be implemented in conjunction with existing services after exploring available evidence-based practices on SAMHSA’s National Registry of Evidence-based Programs and Practices [12] and attending a series of workshops hosted by the Beck Institute. Based on typical implementation project timelines, we established a 5-year contract between the residential setting (key stakeholders included the Director of Clinical and Quality Services and the Vice President of Treatment Programs), the Beck Institute (purveyor), and implementation researchers to collaboratively pursue a 3-stage (i.e., pre-implementation, implementation, sustainment) process to integrate CBT into all levels of care provision. Drawing upon a community-based participatory research approach, all parties contributed to every phase described in detail below. Although this project was designed and funded for the purposes of achieving the clinical program change (i.e., CBT implementation), all parties were in support of contributing to the implementation science knowledge base. The Indiana University Institutional Review Board approved all study procedures.

Participants

In the current case example, given the community partner’s implementation budget at the time the project was conceptualized, only two of six programs were selected for participation in the pilot endeavor. The two programs included a non-secure (unlocked; voluntary and mandated treatment) and a secure (locked, equipped with perimeter fencing and audio and video monitoring; mandated treatment) program of Wolverine Human Services. Young people between the ages of 12 and 18 with a history of severe co-occurring mental health disorders and related behavioral/delinquency challenges (e.g., sexual, violent, and drug offenses) reside in the secure site, while the other site operates as a non-secure residential treatment program for youth in placement subsequent to truancy and/or failure of other placements such as foster care. Typical presenting problems include anxiety and mood disorders, oppositional defiant and conduct disorders, reactive attachment disorder, intermittent explosive disorder, bipolar disorder, posttraumatic stress disorder, and borderline personality disorder, with the majority struggling with externalizing problems and comorbid disorders.

Participants contributing data to inform the tailored blueprint should represent all relevant stakeholders (e.g., clinical, operations, administrative). Participants in the case study consisted of staff stakeholders across levels at the two programs, including operations staff (i.e., youth care workers, shift/care coordinators), therapists, and leaders (e.g., managers, directors). The majority of staff (N = 76) shared their experience via surveys (therapists n = 10, operations staff n = 58, and other staff n = 7; one participant’s position was missing), and a subset of staff (N = 53) participated in one of seven focus groups (therapists n = 15, operations staff n = 38).

Therapists are typically the target of EBP implementation; at Wolverine Human Services, they are the highest ranking member of the treatment team and they are responsible for ensuring that the client’s needs are met by all staff. Therapists at Wolverine Human Services provide on-site education and training to the operation staff. Therapists provide family therapy/education at least monthly, and they conduct the initial and quarterly clinical assessments for each client. Therapists report directly to the court systems (juvenile justice or family/foster care) on the progress and needs of the youth. Therapists also provide individual and group therapy to the youth at the residential settings.

Operations staff are rarely the target of EBP implementation, as they are not typically responsible for clinical care. However, Wolverine Human Services believed that implementing CBT with only therapists could limit or even undermine its impact. Operations staff comprise youth care workers, who provide the majority of care and day-to-day youth oversight in the residential centers, and safety and support specialists, who coordinate staff and client assignments within programs of the organization, train staff, and administer medications to youth.

Procedures

Our tailoring methodology consisted of five steps: (1) needs assessment data collection, (2) mixed methods analysis, (3a) determinant identification (focusing on barriers), (3b) implementation strategy selection, (4) implementation team formation, and (5) implementation blueprint creation; see Fig. 1. What follows are the methods and results from our case study applying these five steps to generate a tailored implementation blueprint for CBT implementation in Wolverine Human Services.

Fig. 1
figure1

Tailoring methodology steps. This figure demonstrates the five steps of the tailoring methodology, grouped by goals and site visits

Step 1: needs assessment data collection

The needs assessment that leads to determinant identification can be conducted via numerous methods, each of which is likely to yield a different set and quantity of barriers and facilitators [7]. For this reason, mixed methods are encouraged. Determinant identification is likely to be optimized and streamlined if probing is guided by one of the 60+ models, frameworks, and theories for implementation [5, 15]. We selected the Framework for Dissemination to guide the quantitative and qualitative data collection in our case example because it was generated from the best available implementation research and an academic-community partnership that offered a relatively parsimonious set of six contextual domains for targeted analysis: (a) norms and attitudes (e.g., knowledge, expectations), (b) structure and process (e.g., mission, size), (c) resources (e.g., financial, physical, human, social, political capital), (d) policies and incentives (e.g., rewards or sanctions), (e) networks and linkages (e.g., conduits among stakeholders that enable information flow), and (f) media and change agents (e.g., active sources of information) [1]. As shown in the parenthetical examples, the first three domains capture the willingness and ability of key stakeholders to both implement and sustain new interventions, whereas the second three domains reflect sources of information and influence.

We administered quantitative measures assessing putative determinants of practice (e.g., attitudes towards evidence based practices, organizational climate, culture, leadership) mapping onto the Framework for Dissemination [1] to staff as part of the pre-implementation activities during the first site visit. The quantitative measures were selected from a database built by the first author through a National Institute of Mental Health funded systematic review of implementation measures [16]; the selected measures are listed and described in Table 1. Participants were excluded from this report only if they did not provide informed consent to allow data to be used for the purposes of research or if they had incomplete survey data on the majority of the measures. Participants generated a personal codename for their survey so that their responses could be anonymous but the research team could link responses over time. We calculated descriptive statistics (means, standard deviations, frequencies) for all applicable scale and total scores.

Table 1 Quantitative measures

Administrators nominated therapists and operations staff to participate in focus groups using purposeful sampling procedures highlighted by Palinkas and colleagues [13] to maximize the presence of both positive and negative viewpoints regarding CBT implementation. Each focus group included four to eight staff to allow all participants to share their viewpoints [17]. We conducted seven 90-min focus groups (moderated by the first author, CCL) to elicit attitudes and beliefs from staff regarding the domains of the Framework for Dissemination [1]. In addition, focus groups discussed past implementation efforts, the effectiveness of implementation strategies used previously, the structure of usual care, and how CBT could fit into current practices. All focus groups were audio recorded, de-identified, and transcribed by a member of the research team.

We chose qualitative analysis procedures to evaluate the themes endorsed by focus group participants within the context of the Framework for Dissemination model domains [1]. We divided all focus group transcripts first into units of meaning to allow qualitative codes to represent specific thematic discussions of varying lengths. We then developed a coding dictionary based on the six domains identified in the Framework for Dissemination [1]; each of the six domains served as an overarching code for the focus group transcripts. Additional codes were added to the coding dictionary based on emergent topics identified in the focus group transcripts, which revealed one additional overarching code (or domain), communication. Fifty-six subcodes (i.e., codes to further clarify the overarching Framework for Dissemination domains) were also identified. Cross-cutting codes that spanned multiple overarching codes and subcodes included positive and negative valence statements; temporal references to past, present, and future; desire for something (e.g., desire for training); and the impact of a discussion topic on adolescents. Typical coder training and coding processes were enacted, the details of which can be found in Additional file 1.

Step 2: mixed methods analysis

Mixed methods analyses followed the QUAN + QUAL structure to serve the function of convergence and expansion in a connecting process [18]. First, we assessed whether determinants emerged as barriers or facilitators based on quantitative norms (above or below, respectively) or on the valence of frequent endorsements in the focus group data (i.e., convergence). Second, we used the qualitative data to gain additional insight into the nature of each barrier or facilitator, that is, how it manifests within sites (i.e., expansion). The quantitative data captured three of the six Framework for Dissemination domains (i.e., Norms and Attitudes, Structure and Process, and Networks and Linkages) [1], while the qualitative data captured all six domains. To generate the implementation blueprint, we focused the determinant analysis on identifying barriers, as these would require active strategies, whereas we took note of facilitators and planned to leverage them wherever possible.
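The quantitative side of this convergence step can be sketched as a simple comparison of site means against published norms. The scale names, values, and the direction-of-scoring flag below are illustrative assumptions, not the study's actual data or decision thresholds:

```python
# Hypothetical sketch: a scale whose site mean falls on the unfavorable side
# of its published norm is flagged as a potential barrier; the favorable side
# is flagged as a potential facilitator.

def classify_determinant(site_mean, norm_mean, higher_is_better=True):
    """Return 'barrier' or 'facilitator' by comparing a site mean to a norm."""
    favorable = site_mean >= norm_mean
    if not higher_is_better:
        favorable = not favorable
    return "facilitator" if favorable else "barrier"

# Invented example scores: (site mean, published norm) per scale.
scores = {
    "EBP attitudes": (2.1, 2.8),
    "Organizational climate": (3.4, 3.0),
}

for scale, (site, norm) in scores.items():
    print(scale, "->", classify_determinant(site, norm))
```

In practice, the direction of "favorable" depends on each measure's scoring, which is why the sketch carries an explicit `higher_is_better` flag.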

Conjoint analysis

Conjoint analysis is a quantitative method wherein stakeholders evaluate and assign different values to the attributes of products, services, interventions, etc. [14]; this process helps clarify competing features or trade-offs to inform decisions about what the focus of a product’s design should be. Applied to the generation of a tailored implementation blueprint (i.e., a product), conjoint analysis can engage stakeholders in the clarification and prioritization of barriers and the selection of strategies to enhance implementation. Stakeholder values can be revealed via various rating and sorting activities, ranging from consideration of feasibility and importance to assigning 100 priority points across available implementation strategies.

Step 3: conjoint analysis

3a: barrier identification

In a second site visit, agency administrators rated the barriers identified from the mixed methods analysis on feasibility (i.e., high or low perceived ability to change the barrier) and on importance (i.e., high or low). To enhance the accessibility of this methodology, we wrote each of the 76 barriers on 3 × 5 note cards with either example items from the quantitative measures or code definitions from the qualitative data. We then presented each note card to the administrators one at a time and asked them to place the barrier on a 2 × 2 poster board grid with importance on the Y-axis and feasibility on the X-axis. Barriers assigned high feasibility and high importance ratings, such as staff CBT knowledge, were considered top priority.
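The 2 × 2 sorting exercise amounts to grouping barriers into quadrants by their two ratings. A minimal sketch, with invented barrier names (except staff CBT knowledge, taken from the text):

```python
# Each barrier carries a high/low feasibility rating and a high/low
# importance rating; the (feasibility, importance) pair is its quadrant.

def prioritize(barriers):
    """Group (name, feasibility, importance) triples into grid quadrants."""
    grid = {}
    for name, feasibility, importance in barriers:
        grid.setdefault((feasibility, importance), []).append(name)
    return grid

ratings = [
    ("staff CBT knowledge", "high", "high"),
    ("physical space constraints", "low", "high"),
    ("paperwork volume", "high", "low"),
]
grid = prioritize(ratings)

# Barriers in the high/high quadrant are treated as top priority.
print(grid.get(("high", "high"), []))
```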

Step 3b: implementation strategy selection

Potential implementation strategies were selected for the set of prioritized barriers in a matching process done collaboratively among researchers, administrators, and staff. Strategies were selected from the compilation generated by Powell et al. [6] in a review and synthesis that included 68 unique strategies. Each strategy and its definition were presented to the stakeholders, who rated strategy feasibility (i.e., high or low) given their budget and other resources. Subsequently, the CBT purveyor and the implementation researchers (who also had CBT expertise) rated the degree to which the strategy was integral to CBT adherence on a scale of 1 for low impact (i.e., not required for adherence), 2 for moderate impact (i.e., high priority for adherence), or 3 for high impact (i.e., a core element of CBT adherence) [19]. Strategies were selected if they were rated as high impact, high feasibility, or both, and were then matched with one or more of the identified barriers based on the strategy’s potential mechanism of action.
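The selection rule described above (retain a strategy if it is high impact, high feasibility, or both) can be expressed as a one-line filter. The ratings below are invented for illustration; only the rule itself comes from the text:

```python
# Stakeholders rate feasibility ("high"/"low"); purveyor and researchers rate
# impact on CBT adherence (1 = low, 2 = moderate, 3 = high). A strategy is
# retained when impact is 3 or feasibility is "high" (or both).

def select_strategies(rated):
    """Keep (name, feasibility, impact) entries meeting the selection rule."""
    return [
        name
        for name, feasibility, impact in rated
        if impact == 3 or feasibility == "high"
    ]

rated = [
    ("conduct ongoing training", "high", 3),   # kept: high impact
    ("revise job descriptions", "high", 1),    # kept: high feasibility
    ("build new facility", "low", 1),          # dropped: neither
]
print(select_strategies(rated))
```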

Step 4: implementation team formation

Two implementation teams were formed, one for each program. These teams consisted of approximately 10 opinion leaders and champions representing all staff levels at each program (i.e., both therapists and operations staff). Members of the implementation team were selected based on a sociometric measure [20] (to identify opinion leaders) and the Evidence-Based Practice Attitudes Scale [21] (to identify champions) completed by therapists and operations staff during the needs assessment. Roles on the implementation team included Chair, Secretary, Program Evaluator, Incentives Officer, and Communication Officer (not all members held a formal role). These officer positions were chosen as the minimum set with clear responsibilities that would allow the team to get up and running and maintain itself over time; see Table 2. Implementation teams serve a variety of functions, including moving the project through implementation, problem solving for sustainability, engaging stakeholders, and increasing buy-in [22, 23]. In the case example, the main goals of the implementation teams were to oversee implementation strategy enactment and to serve as in-house experts (in both implementation and CBT).

Table 2 Officer positions and role responsibilities

The CBT purveyor and implementation researchers met with the implementation teams from each of the two residential programs to provide an overview of the roles, functions, and responsibilities of the implementation teams. We asked teams to generate potential incentives for being members. Next, we presented the implementation teams with an overview of the prioritized barriers, specific examples that emerged from the needs assessment, and their implications for successful CBT implementation based on the latest research. We gave the implementation teams the opportunity to brainstorm additional implementation strategies that could be used to improve on or overcome the identified barriers. This step was important to ensure that team members’ input and language were incorporated into the final blueprint. Finally, the teams generated three top goals to guide their work during the pre-implementation period.

Step 5: implementation blueprint creation

We then organized the top-rated strategies into a 3-phase Implementation Blueprint guided by the implementation science literature and consensus among the CBT purveyor and implementation researchers regarding timing for optimal impact [9]. That is, we decided that it would not be feasible or wise to begin all strategies simultaneously but that some strategies (e.g., form implementation team) ought to be completed before others (e.g., training) are initiated. This implementation blueprint included key strategies that the team agreed would be implemented across pre-implementation, implementation, and sustainment phases, thereby providing a roadmap for the implementation effort. Final steps included a meeting between administrators, the CBT purveyor, and implementation researchers to review the Implementation Blueprint, discuss the roles and responsibilities of the implementation teams and consultants, and confirm the timeline.
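The sequencing logic behind step 5, that some strategies must be completed before others begin, can be sketched as a simple consistency check over the three phases. The phase names and the team-before-training dependency come from the text; the specific strategy assignments and dependency table are illustrative:

```python
# A blueprint assigns each strategy to a phase; a strategy may not be
# scheduled in an earlier phase than any strategy it depends on
# (e.g., form the implementation team before training begins).

PHASES = ["pre-implementation", "implementation", "sustainment"]

def check_blueprint(assignments, depends_on):
    """Return (strategy, prerequisite) pairs that violate the phase order."""
    order = {phase: i for i, phase in enumerate(PHASES)}
    violations = []
    for strategy, prereqs in depends_on.items():
        for prereq in prereqs:
            if order[assignments[strategy]] < order[assignments[prereq]]:
                violations.append((strategy, prereq))
    return violations

blueprint = {
    "form implementation team": "pre-implementation",
    "conduct ongoing training": "implementation",
    "use train-the-trainer strategies": "sustainment",
}
deps = {"conduct ongoing training": ["form implementation team"]}
print(check_blueprint(blueprint, deps))  # an empty list means the ordering is consistent
```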

Results

Therapists were primarily female (n = 7, 70.0%), African American (n = 6, 60.0%) or Caucasian (n = 4, 40.0%), and had an average age of 35.7 (SD = 7.06). Most of the therapists had a Master’s degree (n = 9, 90.0%) and indicated Behavior Modification as their primary theoretical orientation (n = 7, 70.0%). Operations staff were predominantly male (n = 35, 60.3%) and African American (n = 35, 60.3%), with a mean age of 36.7 (SD = 9.75). Similar to therapists, the majority of operations staff cited Behavior Modification as their primary theoretical orientation (n = 35, 60.3%). Operations staff most commonly had some college but no degree (n = 20, 34.5%) or a bachelor’s degree (n = 18, 31.0%).

The needs assessment (step 1) and mixed methods analysis (step 2) revealed 76 unique barriers. Exemplar barriers included climate and morale, communication, training, and teamwork and conflict. Three of the six Framework for Dissemination [1] domains captured by the quantitative measures revealed differences between Wolverine Human Services and national norms, suggesting these domains to be potential barriers or facilitators. In contrast, all six domains of the Framework for Dissemination were discussed as either barriers or facilitators in the qualitative data. See Additional file 2 for details regarding the barriers and facilitators, including quantitative results and exemplar quotes from staff. Conjoint analysis (steps 3a and 3b) prioritized (i.e., high feasibility and high importance) 23 of the 76 unique identified barriers, and 36 implementation strategies were selected (i.e., high impact or high feasibility or both) to address these barriers. Implementation teams were then formed to oversee implementation strategy enactment and serve as in-house experts (step 4). Finally, an implementation blueprint was compiled (step 5) in which the top-rated implementation strategies were organized across phases of implementation. Tables 3, 4, and 5 reflect the tailored blueprint for the pre-implementation, implementation, and sustainment phases, respectively, of the Wolverine Human Services CBT implementation project. As can be seen, 14 discrete strategies were selected for the pre-implementation phase (year 1), 19 for the implementation phase (years 2–4), and 12 for the sustainment phase (year 5). 
Using Waltz et al.’s [24] expert concept mapping solution for organizing strategies into conceptually distinct categories, pre-implementation included strategies across six of the nine categories: develop stakeholder relationships (n = 5, 35.7%), train and educate stakeholders (n = 3, 21.4%), support clinicians (n = 1, 7.1%), adapt and tailor to context (n = 1, 7.1%), use evaluative and iterative strategies (n = 1, 7.1%), and utilize financial strategies (n = 1, 7.1%). The implementation phase included strategies across eight of the nine categories: train and educate stakeholders (n = 5, 26.3%), support clinicians (n = 4, 21.0%), change infrastructure (n = 4, 21.0%), develop stakeholder relationships (n = 3, 15.8%), use evaluative and iterative strategies (n = 2, 10.5%), provide interactive assistance (n = 2, 10.5%), adapt and tailor to context (n = 1, 5.3%), and utilize financial strategies (n = 1, 5.3%). Finally, the sustainment phase included strategies across six of the nine categories: train and educate stakeholders (n = 4, 33.3%), use evaluative and iterative strategies (n = 3, 25.0%), develop stakeholder relationships (n = 2, 16.7%), provide interactive assistance (n = 1, 8.3%), engage consumers (n = 1, 8.3%), and utilize financial strategies (n = 1, 8.3%).
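The percentages reported above are each category's count divided by the number of strategies in that phase. As a quick arithmetic check, the sketch below reproduces the sustainment-phase figures from the counts in the text:

```python
# Sustainment-phase category counts, taken directly from the reported results.
sustainment_counts = {
    "train and educate stakeholders": 4,
    "use evaluative and iterative strategies": 3,
    "develop stakeholder relationships": 2,
    "provide interactive assistance": 1,
    "engage consumers": 1,
    "utilize financial strategies": 1,
}

total = sum(sustainment_counts.values())  # 12 strategies in the phase
percentages = {
    category: round(100 * count / total, 1)
    for category, count in sustainment_counts.items()
}
print(percentages)  # matches the reported 33.3%, 25.0%, 16.7%, and 8.3% figures
```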

Table 3 Blueprint for pre-implementation
Table 4 Blueprint for implementation
Table 5 Blueprint for sustainment

Discussion

This study put forth a collaborative, model-based, mixed qualitative- and quantitative-informed methodology for developing a blueprint to guide the work of implementation teams across pre-implementation, implementation, and sustainment. We applied the methodology in the context of a youth residential center that sought to implement cognitive behavior therapy (CBT). Seventy-six unique barriers to implementation emerged from a mixed methods needs assessment guided by the Framework for Dissemination [1]. A modified, low technology conjoint analysis revealed 23 barriers that agency administrators perceived to be of high importance and highly feasible to address (e.g., teamwork, training, morale, communication). After forming implementation teams consisting of influential staff representatives per social network data, 36 strategies were matched to the prioritized barriers based on their feasibility and potential impact on implementing CBT with fidelity. The strategies were mapped across the phases of pre-implementation (e.g., develop implementation glossary, conduct local consensus discussions, modify context to prompt new behaviors), implementation (e.g., conduct educational outreach visits, remind clinicians, shadow experts), and sustainment (e.g., use train-the-trainer strategies, revise job descriptions) and assigned to either (or both) the implementation teams or purveyors for completion. The resulting blueprint highlights overarching goals of each phase, priorities (high versus low), a timeline, and concrete strategies agreed upon by all stakeholders, the purveyor, and researchers to serve as a roadmap to guide the collaborative effort. 
This methodology leverages the best available research, clarifies the subsequent process and roles for stakeholders, allows for cost mapping of high (e.g., regularly engage implementation teams) versus low (e.g., access new funding) priority implementation strategies, and ensures strategies target the unique contextual barriers of a site.

With respect to the first goal of this methodology—i.e., prioritize barriers––it remains an empirical question as to how different the identified barriers would have been if another model had guided the quantitative and qualitative evaluation. That is, although the Framework for Dissemination [1] was selected as the model to guide the needs assessment in this case example, this methodology could be employed with a variety of different frameworks such as the Consolidated Framework for Implementation Research (CFIR; [25]), the Exploration Preparation Implementation Sustainment (EPIS; [8]) model, or the Dynamic Sustainability Framework [26]. Indeed, there are over 60 dissemination and implementation frameworks, theories, and models in use, each with unique and overlapping constructs identified as potential contextual barriers to the implementation process [15], which will influence measurement and potentially the barriers identified. For instance, the Framework for Dissemination acknowledges the importance of opinion leaders but does not emphasize broader leadership impacts, whereas the EPIS model explicitly highlights both senior leadership buy-in and team-level leadership in the needs assessment phase. Fortunately, there are ongoing efforts to map quantitative measures to the CFIR, which is the most comprehensive taxonomy of constructs, and provide information on the degree to which these measures are pragmatic (e.g., low cost, brief, actionable) and psychometrically strong [16]. There is also an evolving website that lists common constructs, associated measures, and information on the models identified by Tabak and colleagues in their review [27]. Both of these emerging resources will prove useful for identifying barriers in future research.
Moreover, use of frameworks helps focus the assessment on key constructs and yields a manageable set of barriers, which can otherwise quickly become unwieldy (e.g., over 600 determinants) when open brainstorming is used as the method [7]. Until we know more about robust contextual predictors or mediators of implementation, comprehensive assessments of this nature will likely need to be conducted.

With respect to the second goal of this methodology—i.e., match strategies to barriers—there are numerous structured processes emerging (e.g., conjoint analysis, intervention mapping, group model building) that could be applied [5]; however, all require some knowledge of available implementation strategies. Powell et al. [6, 9] offer a compilation of discrete implementation strategies that can serve as a menu from which to select options. In the current methodology, stakeholder perception of strategy feasibility and purveyor expertise regarding strategy impact on CBT fidelity guided the selection process, and implementation experts refined the matching to barriers and staging across implementation phases. There is little empirical guidance regarding which strategies target which barriers, how strategies act to create change (i.e., mechanisms), and at which point in the implementation process a strategy may be most potent. Recent work on “Expert Recommendations for Implementing Change” endeavors to address some of these gaps in the literature [19]. They recently produced a taxonomy of implementation strategies, a conceptual map of strategies across nine categories [24], and results from an expert-led hypothetical matching exercise. Results are forthcoming for their expert-led mapping of strategies to implementation phase for their hypothetical scenarios. We applied the results of their work to date in the current methodology to summarize the blueprint in implementation strategy terms and found that although all nine categories were represented across the three phases, some were observed in high frequency across phases (e.g., develop stakeholder interrelationships, train and educate stakeholders) whereas others emerged only once (e.g., engage consumers). We do not yet know if this distribution was optimal or if the strategies selected effectively targeted the key determinants in these programs.
In a recent publication, Boyd and colleagues found that certain strategy categories (i.e., financial strategies) were significantly related to implementation outcomes whereas others were not [28]. More research is needed to enhance our proposed methodology with empirical guidance for strategy selection and matching.

We made significant modifications to the traditional conjoint analysis process in order to lower the cost and increase accessibility. For instance, we opted for note cards and poster board rather than Sawtooth Software packages [29] to prioritize barriers. Although this may have compromised the rigor of the method, there is concern that a divide is growing between best practices from implementation science and what is actually happening (and feasible) in the practice world [16, 30]. For example, concept mapping is gaining empirical support as an effective method for engaging stakeholders in systematically prioritizing barriers [31]. However, concept mapping is resource intensive (e.g., electronic devices are needed for survey completion, and special software is needed for data collection and analysis) and requires content and methodological expertise (i.e., to develop content for the software and to analyze and interpret the data). Also, internet access is not ubiquitous, especially in rural communities such as the one in which our case example occurred. Moreover, although it is relatively more feasible than concept mapping, the proposed methodology carries its own personnel burden: the quantitative surveys require some level of analysis (e.g., descriptive statistics) and possibly comparison (e.g., calculation of effect sizes relative to norms or to a neutral midpoint), and the qualitative data require transcription as well as skilled personnel for formal coding. With respect to the quantitative assessment component, there is a relatively new effort to develop “pragmatic measures” that stakeholders could use independently to inform an implementation effort [16]. With respect to insights gained in a more qualitative manner, rapid ethnography may be a viable alternative [32]. Regardless, tailoring strategies to the context is inherently more work than a standardized approach.
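To illustrate the kind of lightweight quantitative analysis mentioned above, the following sketch (hypothetical data and function name, not drawn from the study) computes a standardized effect size of survey responses against a neutral scale midpoint, one way a team might flag a construct as a potential barrier:

```python
# Illustrative sketch only: a Cohen's d-style effect size comparing a
# survey subscale mean to a fixed reference value (e.g., the neutral
# midpoint of a 1-5 Likert scale). Data and names are hypothetical.
from statistics import mean, stdev


def effect_size_vs_midpoint(responses, midpoint):
    """Standardized difference between the sample mean and a fixed
    reference value, using the sample standard deviation."""
    if len(responses) < 2:
        raise ValueError("need at least two responses")
    return (mean(responses) - midpoint) / stdev(responses)


# Hypothetical clinician ratings on a 1-5 scale (3 = neutral midpoint).
ratings = [2, 2, 3, 1, 2, 3, 2, 1, 2, 2]
d = effect_size_vs_midpoint(ratings, midpoint=3)
# A large negative d suggests responses fall well below neutral,
# marking the construct as a candidate barrier for prioritization.
```

A comparison against published norms would follow the same pattern, substituting the normative mean for the scale midpoint.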

There are several differences between our work and that of the Tailored Implementation for Chronic Disease (TICD) project [10], including the target disease/disorder (behavioral health versus chronic health conditions), setting (small, rural residential care center versus large healthcare organizations), participants (providers and staff with little education beyond high school versus doctorate-level professionals), and country in which the studies were conducted (USA versus five European countries). There were also differences in our methodologies: we presented stakeholders with implementation strategies (drawn from a published compilation [6]) that they then rated, whereas the TICD investigators encouraged open brainstorming followed by a structured interview that brought evidence from implementation science to bear. Despite these differences, one similarity is that barriers were assessed and prioritized only once, at the beginning of the process, to inform the tailored approach to implementation. An approach more aligned with continuous quality improvement might lead to better results, although it would also demand more resources. In our current methodology, we do reassess contextual factors every 6 to 12 months to determine whether any barriers have re-emerged.

There are several noteworthy limitations to the current study. First, the methodology was piloted in a youth residential center, so the results may have limited generalizability to other settings. Second, we did not work with the two programs (secure and non-secure) of the residential center separately to see whether different blueprints would have emerged, because the administrators preferred to keep some level of continuity/standardization in the project despite acknowledging that different barriers emerged between the programs. Third, the blueprints yielded a rich and complex suite of implementation strategies for enactment across the phases of implementation (i.e., 45 unique strategies were selected across nine categories). If the goal had been to generate the most parsimonious approach, it is unclear which strategies were necessary to address which determinants and when. Fourth, the blueprints focused on barriers and did not include facilitators, even though our needs assessment captured this information. It is an empirical question whether targeting barriers alone is more or less efficient than a combined approach (targeting barriers while leveraging facilitators). Fifth, although our preliminary results suggest that enactment of the pre-implementation blueprint was very successful in ameliorating the prioritized barriers [33], we are unable to report on the effectiveness of the strategies across the implementation and sustainment phases because the project is ongoing and no comparison conditions exist. Finally, this approach necessitates prospective tailoring, which assumes the context is static, a fallacy that is highlighted in the Dynamic Sustainability Framework [26]. Empirical work is needed to separate the differential impacts of one-time prospective tailoring versus iterative adaptation over time.
In our annual reassessment of Wolverine Human Services, we have observed that some barriers re-emerge and need to be addressed and that new implementation strategies are prioritized that we did not plan for in our initial blueprint.

Conclusions

Despite the apparent need to select and match strategies to fit the implementation context, few methodologies exist. In this paper, we presented a model-based, mixed qualitative and quantitative methodology for generating a tailored blueprint by achieving two goals: (1) prioritizing barriers and (2) matching strategies to the prioritized barriers. Our methodology is being applied to an implementation of CBT in youth residential settings, and the process yielded a manageable set of prioritized barriers and a clear plan, in the form of an implementation blueprint, for which strategies to engage, by whom, and when. Collaboratively developed blueprints of this nature offer numerous benefits, notably transparency and stakeholder ownership of the process. We will examine the effectiveness of the resulting blueprint on implementation outcomes as the process unfolds using an interrupted time series design. However, research is needed to determine whether this methodology outperforms other approaches.

Abbreviations

CBT:

Cognitive behavioral therapy

CFIR:

Consolidated Framework for Implementation Research

EBPs:

Evidence-based psychosocial interventions

EPIS:

Exploration Preparation Implementation Sustainment

References

1. Mendel P, Meredith L, Schoenbaum M, Sherbourne C, Wells K. Interventions in organizational and community context: a framework for building evidence on dissemination and implementation in health services research. Admin Pol Ment Health. 2008;35. https://doi.org/10.1007/s10488-007-0144-9.

2. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: a randomized trial. Psychiatr Serv. 2012;63(7):660–5. https://doi.org/10.1176/appi.ps.201100401.

3. Lewis CC, Scott K, Marti CN, Marriott BR, Kroenke K, Putz JW, Mendel P, et al. Implementing Measurement-Based Care (iMBC) for depression in community mental health: a dynamic cluster randomized trial study protocol. Implement Sci. 2015;10:127. https://doi.org/10.1186/s13012-015-0313-2.

4. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;4. https://doi.org/10.1002/14651858.CD005470.pub3.

5. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, Mandell DS. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94. https://doi.org/10.1007/s11414-015-9475-6.

6. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57. https://doi.org/10.1177/1077558711430690.

7. Krause J, Van Lieshout J, Klomp R, Huntink E, Aakhus E, Flottorp S, Jaeger C, et al. Identifying determinants of care for tailoring implementation in chronic diseases: an evaluation of different methods. Implement Sci. 2014;9:102. https://doi.org/10.1186/s13012-014-0102-3.

8. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin Pol Ment Health. 2011;38(1):4–23. https://doi.org/10.1007/s10488-010-0327-7.

9. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. https://doi.org/10.1186/s13012-015-0209-1.

10. Huntink E, van Lieshout J, Aakhus E, Baker R, Flottorp S, Godycki-Cwirko M, Jäger C, et al. Stakeholders’ contributions to tailored implementation programs: an observational study of group interview methods. Implement Sci. 2014;9(1):185. https://doi.org/10.1186/s13012-014-0185-x.

11. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. Am J Public Health. 2010;100(S1):S40–S6. https://doi.org/10.2105/ajph.2009.184036.

12. Substance Abuse and Mental Health Services Administration (SAMHSA). National Registry of Evidence-based Programs and Practices (NREPP). 2016. https://www.samhsa.gov/nrepp. Accessed 1 Jun 2017.

13. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Admin Pol Ment Health. 2015;42(5):533–44. https://doi.org/10.1007/s10488-013-0528-y.

14. Green PE, Srinivasan V. Conjoint analysis in marketing: new developments with implications for research and practice. J Mark. 1990;54(4):3–19. https://doi.org/10.2307/1251756.

15. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50. https://doi.org/10.1016/j.amepre.2012.05.024.

16. Lewis CC, Weiner BJ, Stanick C, Fischer SM. Advancing implementation science through measure development and evaluation: a study protocol. Implement Sci. 2015;10:102. https://doi.org/10.1186/s13012-015-0287-0.

17. Fern EF. The use of focus groups for idea generation: the effects of group size, acquaintanceship, and moderator on response quantity and quality. J Mark Res. 1982;19(1):1–13. https://doi.org/10.2307/3151525.

18. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlbert M, Landsverk J. Mixed methods designs in implementation research. Admin Pol Ment Health. 2011;38. https://doi.org/10.1007/s10488-010-0314-z.

19. Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, Damschroder LJ, et al. Expert Recommendations for Implementing Change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9(1):39. https://doi.org/10.1186/1748-5908-9-39.

20. Valente TW, Chou CP, Pentz MA. Community coalitions as a system: effects of network change on adoption of evidence-based substance abuse prevention. Am J Public Health. 2007;97(5):880–6. https://doi.org/10.2105/ajph.2005.063644.

21. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74. https://doi.org/10.1023/b:mhsr.0000024351.12294.65.

22. National Implementation Research Network (NIRN). National Implementation Research Network’s Active Implementation Hub. University of North Carolina at Chapel Hill’s FPG Child Development Institute. 2017. http://implementation.fpg.unc.edu/. Accessed 6 Jun 2017.

23. Higgins MC, Weiner J, Young L. Implementation teams: a new lever for organizational change. J Organ Behav. 2012;33(3):366–88. https://doi.org/10.1002/job.1773.

24. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, Proctor EK, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109. https://doi.org/10.1186/s13012-015-0295-0.

25. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50.

26. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117. https://doi.org/10.1186/1748-5908-8-117.

27. The Center for Research in Implementation Science and Prevention. Dissemination & Implementation Models in Health Research & Practice. No date. http://www.dissemination-implementation.org/index.aspx. Accessed 30 Apr 2017.

28. Boyd MR, Powell BJ, Endicott D, Lewis CC. A method for tracking implementation strategies: an exemplar implementing measurement-based care in community behavioral health clinics. Behav Ther. 2017. https://doi.org/10.1016/j.beth.2017.11.012.

29. Sawtooth Software. Conjoint Analysis. 2015. http://www.sawtoothsoftware.com/products/conjoint-choice-analysis

30. Weisz JR, Ng MY, Bearman SK. Odd couple? Reenvisioning the relation between science and practice in the dissemination-implementation era. Clin Psychol Sci. 2014;2(1):58–74. https://doi.org/10.1177/2167702613501307.

31. Granholm E, Holden JL, Sommerfeld D, Rufener C, Perivoliotis D, Mueser K, Aarons GA. Enhancing assertive community treatment with cognitive behavioral social skills training for schizophrenia: study protocol for a randomized controlled trial. Trials. 2015;16:438. https://doi.org/10.1186/s13063-015-0967-8.

32. Saleem JJ, Plew WR, Speir RC, Herout J, Wilck NR, Ryan DM, Cullen TA, et al. Understanding barriers and facilitators to the use of Clinical Information Systems for intensive care units and Anesthesia Record Keeping: a rapid ethnography. Int J Med Inform. 2015;84(7):500–11. https://doi.org/10.1016/j.ijmedinf.2015.03.006.

33. Lewis CC, Marriott BR, Scott K. A8: tailoring a cognitive behavioral therapy implementation protocol using mixed methods, conjoint analysis, and implementation teams. In: 3rd Biennial Conference of the Society for Implementation Research Collaboration (SIRC); 2015; Seattle. Implementation Science; 2016. p. 85.

34. Aarons GA, McDonald EJ, Sheehan AK, Walrath-Greene CM. Confirmatory factor analysis of the Evidence-Based Practice Attitude Scale (EBPAS) in a geographically diverse sample of community mental health providers. Admin Pol Ment Health. 2007;34(5):465–9. https://doi.org/10.1007/s10488-007-0127-x.

35. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv. 2006;3(1):61–72. https://doi.org/10.1037/1541-1559.3.1.61.

36. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatr Serv. 2006;57(8):1162–9. https://doi.org/10.1176/appi.ps.57.8.1162.

37. Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G. Psychometric properties and United States norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;3. https://doi.org/10.1037/a0021103.

38. Jensen-Doss A, Hawley KM. Understanding barriers to evidence-based assessment: clinician attitudes toward standardized assessment tools. J Clin Child Adolesc Psychol. 2010;39(6):885–96. https://doi.org/10.1080/15374416.2010.517169.

39. Glaser SR, Zamanou S, Hacker K. Measuring and interpreting organizational culture. Manag Commun Q. 1987;1(2):173–98. https://doi.org/10.1177/0893318987001002003.

40. Glisson C, James LR. The cross-level effects of culture and climate in human service teams. J Organ Behav. 2002;23(6):767–94. https://doi.org/10.1002/job.162.

41. Zamanou S, Glaser SR. Moving toward participation and involvement: managing and measuring organizational culture. Group Organ Manag. 1994;19(4):475–502. https://doi.org/10.1177/1059601194194005.

42. Lehman WEK, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abus Treat. 2002;22(4):197–209. https://doi.org/10.1016/S0740-5472(02)00233-7.

43. Broome KM, Flynn PM, Knight DK, Simpson DD. Program structure, staff perceptions, and client engagement in treatment. J Subst Abus Treat. 2007;33(2):149–58. https://doi.org/10.1016/j.jsat.2006.12.030.

44. Broome KM, Knight DK, Edwards JR, Flynn PM. Leadership, burnout, and job satisfaction in outpatient drug-free treatment programs. J Subst Abus Treat. 2009;37(2):160–70. https://doi.org/10.1016/j.jsat.2008.12.002.

45. Joe GW, Broome KM, Simpson DD, Rowan-Szal GA. Counselor perceptions of organizational factors and innovations training experiences. J Subst Abus Treat. 2007;33(2):171–82. https://doi.org/10.1016/j.jsat.2006.12.027.

46. Garner BR, Hunter BD, Godley SH, Godley MD. Training and retaining staff to competently deliver an evidence-based practice: the role of staff attributes and perceptions of organizational functioning. J Subst Abus Treat. 2012;42(2):191–200. https://doi.org/10.1016/j.jsat.2011.10.016.

47. Scott K, Dorsey C, Marriott B, Lewis CC. Psychometric validation of the Impact of Infrastructure Survey. Philadelphia: Association for Behavioral and Cognitive Therapies Annual Convention; 2014.

48. Valente TW, Ritt-Olson A, Stacy A, Unger JB, Okamoto J, Sussman S. Peer acceleration: effects of a social network tailored substance abuse prevention program among high-risk adolescents. Addiction. 2007;102(11):1804–15. https://doi.org/10.1111/j.1360-0443.2007.01992.x.

Acknowledgements

The authors would like to acknowledge the Society for Implementation Research Collaboration and their biennial conference series. We would like to acknowledge Dr. Robert Hindman for his intellectual contributions to the blueprint. We would also like to acknowledge Katrina Brock and Derrick McCree and the participants of Wolverine Human Services for being such active partners.

Funding

Wolverine Human Services is funding a 5-year study of the implementation of cognitive behavioral therapy into a youth residential treatment center, from October 1, 2013, to September 30, 2018.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Author information

CCL initially conceptualized the project in collaboration with the Beck Institute and stakeholders at the residential treatment center. CCL is the project PI and co-lead trainer alongside Beck Institute Faculty. BRM was the project coordinator and oversaw the quantitative data collection, entry, cleaning, analysis, and interpretation. KS assisted with training and all components of the tailored implementation process, notably leading the qualitative analysis. CCL drafted the background and discussion of the manuscript. BRM and KS drafted major components of the method and results sections. All authors read and approved the final manuscript.

Correspondence to Cara C. Lewis.

Ethics declarations

Ethics approval and consent to participate

This study was approved as an expedited study (#1312941981) by the Indiana University Institutional Review Board.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1:

Details regarding the qualitative coder training and coding process. This file provides additional information regarding the training of qualitative coders and the process followed for coding qualitative data. (DOCX 14 kb)

Additional file 2:

Barriers and facilitators. This file includes details regarding the barriers and facilitators, including quantitative results and exemplar quotes from staff. (DOCX 19 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Keywords

  • Tailored implementation
  • Conjoint analysis
  • Mixed methods
  • Community partnership
  • Youth residential setting