
Non-participants in policy efforts to promote evidence-based practices in a large behavioral health system

Abstract

Background

System-wide training initiatives to support the implementation of evidence-based practices (EBPs) in behavioral health systems have become increasingly widespread. Understanding more about organizations that do not participate in EBP training initiatives is a critical piece of the dissemination and implementation puzzle if we endeavor to increase access to EBPs in community settings.

Methods

We conducted 30 one-hour semi-structured interviews with leaders of agencies that had not formally participated in system-wide training initiatives to implement EBPs in the City of Philadelphia, with the goal of understanding why they did not participate.

Results

We found that despite not participating in training initiatives, most agencies were adopting (and self-financing) some EBP implementation. Leaders of agencies that were implementing EBPs reported relying on previously trained staff and acknowledged a lack of emphasis on fidelity. Most leaders at agencies not adopting EBPs did not have a clear understanding of what an EBP is. Those familiar with EBPs at non-adopting agencies reported philosophical objections to them. When asked about quality assurance and treatment selection, leaders reported being guided by system audits.

Conclusions

While it is highly encouraging that many agencies are adopting EBPs on their own, significant questions about fidelity and implementation success more broadly remain.


Background

Implementation of evidence-based practices (EBPs) has become an integral focus in behavioral health service delivery [1–3] due to increasing awareness that EBPs are not widely available in community behavioral health settings [4–6]. The implementation of EBPs could confer many advantages to the public sector, including better therapeutic outcomes and improved cost-effectiveness compared with treatment as usual [7, 8]. The process of implementing EBPs in public behavioral health systems is complex and fraught with challenges; about half of implementation efforts fail [9, 10].

Policymakers in many large behavioral health systems, including Philadelphia, Washington, New York, Hawaii, and Los Angeles County, have recently invested significant resources in supporting EBP implementation [11, 12]. Some systems have legislatively mandated that organizations and providers use EBPs (e.g., Washington) or have tied payment to the use of EBPs (e.g., Los Angeles). Other systems have supported the use of EBPs by offering EBP training without mandating their use (e.g., Philadelphia). Understanding the factors that lead to participation in voluntary initiatives can provide insight into what drives individuals and organizations to adopt EBPs when they are not required to do so. Recent studies have examined the perspectives of stakeholders in large behavioral health systems who participate in EBP training initiatives [13, 14]. Little is known, however, about those who do not participate in such initiatives [15, 16].

For the purposes of this paper, non-participants are defined as organizations that did not participate in formal system-led training initiatives. Non-participants may include both adopters and non-adopters of EBPs. Gaining a more granular understanding of non-participants can help identify factors that can be targeted by tailored implementation strategies [17]. Organizations that did not participate in EBP training initiatives are likely a heterogeneous group, and different reasons may drive non-participation. We hypothesize several sub-categories of non-participants: those (1) who deliberately do not adopt EBPs, (2) who adopt EBPs without relying on system support, and (3) who are not aware of EBPs or the training opportunities provided. The purpose of the present study was to engage with leaders of organizations that had not formally participated in system-wide EBP training efforts, with the goal of understanding these leaders' perspectives and experiences.

Methods

Setting

In Philadelphia, over the past 8 years, the Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) has supported the implementation of several EBPs in select mental health and substance abuse agencies across multiple levels of care. Specifically, DBHIDS has implemented the Beck Community Initiative (cognitive therapy [18, 19]), the Trauma Initiative (prolonged exposure, trauma-focused cognitive behavioral therapy [20]), and the Dialectical Behavior Therapy Initiative. Agencies receive free training, supervision, and, in the case of trauma-focused cognitive behavioral therapy, an enhanced reimbursement rate. In 2012, DBHIDS established the Evidence-Based Practice & Innovations Center (EPIC), a centralized infrastructure that supports implementation and sustainment of EBPs in the community [16]. Approximately 50 of the 200 agencies in the Community Behavioral Health (CBH) network (see Note 1) have participated in one or more EBP training initiatives [16]. The 75% of agencies that are non-participants are the focus of the present study. These EBP training initiatives are part of a larger systemic transformation that began in 2005, emphasizing and prioritizing the cultures, resilience, and strengths of consumers and their families within a recovery-oriented and trauma-informed framework [20, 21].

The process through which DBHIDS selects organizations for initiatives has evolved over time. Initially, selection was largely guided by DBHIDS (e.g., larger, excelling, and failing organizations were chosen). More recently, organizations have applied for participation through a competitive request-for-applications (RFA) process and are selected by DBHIDS [16]. All procurement opportunities are systematically distributed through the CBH Executive Director listserv. Announcements regarding the EBP initiatives are also disseminated through the (optional) CBH-News and EPIC listservs and are posted on the DBHIDS, CBH, and EPIC websites.

Participants and procedure

To obtain a sample of EBP non-participating agencies (NPAs) in this system, we included all mental health and drug and alcohol treatment agencies located in Philadelphia County that (1) serve more than 100 clients yearly, (2) had not participated in any of the DBHIDS initiatives, and (3) had no other known EBP activity (e.g., participation in non-DBHIDS, external but formalized grant-funded EBP trainings, initiatives, or implementations). This yielded an eligible sample of 51 agencies, of which four were excluded because they had closed (n = 2) or had been acquired by an agency that was an initiative participant (n = 2).

We contacted agency leadership at each of the remaining 47 agencies via e-mail to ascertain interest. We focused on leaders because we wanted to speak with the person at each agency best acquainted with the selection, operation, implementation, and management of current and new practices. Three agencies declined participation and 11 did not respond to repeated recruitment efforts. Of the 33 agencies that agreed to participate, 30 were interviewed (response rate = 64%, or 30 of 47; the remaining three agencies were not responsive to scheduling requests after initially agreeing to participate). Interviews were conducted in person by the first author between May 2015 and January 2016. Participants gave written consent and were compensated $150 for participation.

Qualitative interview

We developed a semi-structured interview guide consisting of three parts. The first set of questions was exploratory and system-specific: we asked if leadership had awareness or knowledge of the system-sponsored EBP training offerings and if the agency had applied to participate in these initiatives (e.g., What has your agency’s exposure been to the DBHIDS EBP initiatives? Has your agency taken part in the DBHIDS EBP initiatives?). In the second part of the interview, we asked about general attitudes towards EBP and if and how the agency had engaged in or implemented any EBP (e.g., Tell me what you think of when you hear the term EBP? Tell us about any experiences with EBP to date in your agency.). Third, we asked questions to understand more about the agency’s overall treatment philosophy, quality assurance practices, treatment approaches, and population (e.g., What is the agency treatment philosophy or mission? How do you ensure quality of your practices/therapists?).

Data analysis

Given the exploratory nature of the study, grounded theory informed our overall data-analytic strategy [22]. We used an iterative process to analyze the interview transcripts and develop a codebook. The investigators developed an initial set of codes through a close reading of five transcripts, applying an inductive approach [23]. The coders then coded four more transcripts separately and met to adjudicate differences, create additional codes, develop coding rules, consolidate redundant concepts, and finalize the codebook. Coders used NVivo qualitative data analysis software. Once coding was complete, a random 20% of transcripts were coded by two investigators, and inter-rater reliability was excellent (κ = .98; [24]). The first author read through the coded nodes and produced memos, including examples and commentary, on emergent themes within the nodes. Finally, the first and last authors conducted consensus coding. The interview guide and codebook are available on request from the first author.
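To make the double-coding reliability check above concrete, the following is a minimal illustrative sketch in Python of how Cohen's kappa can be computed for two coders' labels on the same transcript segments. The data layout and labels are hypothetical, and the paper does not report which software computed this statistic:

    # Minimal sketch (hypothetical data): Cohen's kappa for two coders who
    # independently applied codebook labels to the same transcript segments.
    from sklearn.metrics import cohen_kappa_score

    coder_1 = ["barrier", "fidelity", "training", "fidelity", "barrier", "training"]
    coder_2 = ["barrier", "fidelity", "training", "fidelity", "barrier", "barrier"]

    kappa = cohen_kappa_score(coder_1, coder_2)
    print(f"Cohen's kappa = {kappa:.2f}")  # the study reports kappa = .98 on its 20% subset

By the conventional Landis and Koch benchmarks [24], values of .81 and above indicate almost perfect agreement, so the reported κ of .98 suggests the two coders applied the codebook nearly identically.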

A central question in this research was whether the NPAs were adopters or non-adopters of EBP. We coded an agency as an “adopter agency” if the leader interviewed described his or her agency, or at least one therapist, as engaging in any practice listed as an EBP in the Substance Abuse and Mental Health Services Administration National Registry of Evidence-based Programs and Practices (NREPP) [25]. Agencies whose leaders did not endorse implementing any EBP in this national registry were coded as “non-adopters.” Based on the transcripts, the first author sorted the 30 agencies into the two categories, and the last author categorized a randomly selected subset (20%) with 100% agreement with the first author.

Results

Participants

Characteristics of the leaders and the agencies they represented are denoted in Table 1, column 1. Leaders included 24 executive directors/CEOs and six clinical directors.

Table 1 Characteristics of non-participant agencies and leaders

Knowledge of EBP initiatives

The first question of our investigation was whether the NPAs knew about the initiatives. Most NPAs (60%) were unaware of the DBHIDS EBP training initiatives (“never heard of them”) and expressed interest in hearing more. Of the agencies that knew about the EBP initiative opportunities, 67% had not applied to take part, citing time constraints associated with participating in the initiative and/or completing the application, or disinterest in the training opportunity. The remaining agencies (13% of the full sample) had applied to the initiatives and been rejected, or were still in the midst of the application process.

Adopter/non-adopter categorization

A second question of this investigation was whether NPAs were adopting EBPs despite their non-participation in the system-led training initiatives. Of the 30 NPAs, we classified 20 (67%) as EBP adopter agencies and 10 (33%) as EBP non-adopter agencies. Adopters and non-adopters were equally aware of the EBP initiatives, and a significant minority of adopters had applied or were applying to the initiatives. Fisher's exact tests revealed no differences between adopters and non-adopters in leader demographics or agency characteristics (all ps > .30; see Table 1).
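As an illustration of the statistical comparison above, here is a minimal Python sketch of a Fisher's exact test on a single 2 × 2 contingency table. The counts are invented for illustration only; the paper reports just that all ps exceeded .30:

    # Minimal sketch (hypothetical counts): Fisher's exact test comparing
    # adopter (n = 20) and non-adopter (n = 10) agencies on one binary
    # characteristic. The exact test suits the small sample of 30 agencies.
    from scipy.stats import fisher_exact

    #        characteristic present, absent
    table = [[12, 8],   # adopter agencies
             [5, 5]]    # non-adopter agencies

    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")

Fisher's exact test is preferred over a chi-squared test here because, with only 30 agencies, some cells would have expected counts below five.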

Adopter agencies

A list of interventions named by the adopter agencies can be found in Table 2. Adopter agencies were those adopting EBPs outside of the system-led training initiatives. Two major themes emerged from these agencies: a reliance on the prior training and qualifications of their staff to deliver EBPs, and a lack of emphasis on fidelity.

Table 2 List of treatments reported by adopter agencies

Competence of staff

Leaders from adopter agencies asserted that they hired only qualified therapists (“professional staff”) with “EBP-oriented attitudes” who had acquired training in graduate school. As one supervisor noted, “We only hire masters-level clinicians and I find that that’s [EBPs] what their training has been.” Agency directors acknowledged paying for some training (internal, external, or online) but relying primarily on “teach-backs” to train the rest of the staff. As one described: “We can’t directly have someone like [the University of Pennsylvania] train contractors. That’s impossible, but it’s sort of the Jesus model. Jesus and his apostles train the trainers who all very much commit and go out and proselytize.” Leadership from adopting agencies also noted that securing outside grants to finance trainings was important, as was culture change.

Fidelity

Most agency leaders reported that fidelity to EBP was not prioritized in their agencies: “We don’t get crazy about fidelity.” Fidelity was often viewed as not feasible in community behavioral health. As one clinical director described: “I try to teach these rigid [cognitive therapy] protocols to my therapists and they say, ‘This won’t fly with my guys.’”

Non-adopter agencies

Most leaders (60%) at non-adopter agencies required clarification on the definition of EBPs. The remainder were familiar with EBP but had clear philosophical objections.

Confusion about EBP

Many equated EBPs with outcomes. These administrators tended to view EBPs simply as a means to an end (i.e., collecting outcomes) rather than as manualized clinical practices shown to produce robust patient outcomes in randomized controlled trials. For example, one leader said, “When I think about EBPs I think of tangible criteria that will measure where a person started, where at midpoint, and where they finally end.”

Philosophical objections to EBP

The remaining leaders of non-adopter agencies demonstrated a clear understanding of EBPs but reported philosophical misgivings: objections to the practices themselves, concerns about who is promoting them, and doubts about their fit with community populations. One non-adopting CEO epitomized such misgivings: “I think it’s unethical for CBH, which is after all an insurance company, to be pushing and promoting and putting words in therapists’ mouths about what they should do.”

Comparisons between non-adopter and adopter agencies

We conducted post hoc thematic comparisons between adopter and non-adopter agencies to identify whether thematic differences emerged between the two groups. The groups differed in attitudes about intervention-population fit, treatment approaches, policy-directed care, and quality assurance.

Attitudes about fit with population

Leaders from non-adopter agencies were more likely than leaders from adopter agencies to express concerns that EBPs (or manualized treatments) were not applicable to their work in community behavioral health centers, specifically their complex and comorbid client populations: “It [EBP] doesn’t fit the needs of the people we are serving.” Also prominent among non-adopter agency leadership were beliefs that EBPs should be blended with the “art” of therapy and real-life experiences, or that EBPs often do not apply holistically: “Sometimes I have to leave a manual or the theory and work with my gut because the clients I had before me were real clients with real need and need help and I can’t just stick with the therapy.”

In contrast, leaders from adopter agencies were more likely to believe that EBPs fit well with their population. As one remarked: “The [cognitive-behavior therapy] treatment approach seems to be…direct for our population. Some of our population, their average is about 5th or 6th grade reading level. The gentlemen and women that we work with—seem to understand it well.”

Treatment modalities

Leaders from non-adopting agencies reported using primarily spirituality and faith-based modalities. As one executive described, “We do the work of walking alongside hurting and broken people and we see that as sacred.” In contrast, leadership from adopter agencies endorsed cognitive behavioral strategies and client-centered and eclectic approaches. As one clinical director explained, “It’s not so unstructured that we look dynamic—as much as I hate to use this term, it’s more eclectic cognitive therapy.”

Quality assurance

When queried about treatment selection and overall quality assurance practices at their agencies, a majority of leaders from adopting agencies identified internal mechanisms, such as staff meetings, supervision, and internal audits. One director noted, “We firmly believe that in order to provide a service you need to have a highly trained and highly supervised staff who is receiving direct feedback in order to make sure we have treatment integrity.” Non-adopter agency leadership primarily identified external system-level audits as the way quality is ensured: “And really, the audits that CBH has would check on those things [quality].”

Discussion

The present study is the first examination of community behavioral health agencies that do not participate in system-sponsored EBP training initiatives. This research is relevant and timely given the increasing number of public health settings in which EBPs are encouraged, often with associated resources such as free training and supervision or an enhanced reimbursement rate. Our results indicate that many leaders from NPAs were unaware of the EBP initiative offerings, a lost opportunity suggesting that policymakers may need more effective outreach to inform all providers of such opportunities. Despite not participating in the formal EBP efforts led by the system, many of the NPAs in the Philadelphia system are implementing EBPs of their own accord, with their own resources. Despite this promising and somewhat unexpected finding given the fiscal challenges in community behavioral health [13, 26], leaders from adopter agencies unapologetically acknowledged that they did not prioritize rigid fidelity during the implementation process. Maintaining fidelity is resource-intensive, and it is not surprising that other demands in a community behavioral health agency take precedence. Although some research suggests that a high level of adherence may not be necessary for successful treatment outcomes (e.g., [27]), the efficacy (as well as acceptability and sustainability) of treatment implementation in the absence of fidelity checks remains an important empirical question.

This lack of focus on rigid fidelity may be best explained by the manner in which adopter agencies gained expertise in EBPs. Leaders from most adopting agencies reported that their agencies were not participating in any formal training, supervision, or consultation efforts, relying instead on previously trained staff, 1-day workshops, and teach-backs. Although an updated survey is warranted, past surveys of training programs in many disciplines suggest that training programs fail to provide even a minimum level of EBP training [28–30]. It may therefore be problematic to rely on the prior training of an incoming workforce. More research is needed on the efficacy of these training efforts compared with proven effective multi-component training models [31–33]. Nonetheless, adopting agencies must be lauded for implementing complex practices such as EBPs without any systemic support.

Leadership from a number of non-adopting organizations reported not implementing EBPs because they view EBPs as incompatible with their agency’s clinical practice. Almost every implementation science framework describes the importance of the fit between the intervention and the population in which it will be implemented [34, 35], and stakeholders consistently raise it as a challenge to the adoption and use of EBPs [36–41]. Interestingly, in our prior work with stakeholders in Philadelphia who participated in the EBP training initiatives, we found that leadership did not raise intervention/client fit as a barrier to implementation and were more likely to label the fit between intervention and client as a facilitator [13]. In addition, leadership from adopting organizations in the current study reported that EBPs fit well with their population. Whether adopter and non-adopter agencies serve different clientele remains an empirical question. It is also possible that implementing an EBP may change attitudes about intervention/client fit. Nonetheless, this fundamental difference between adopters and non-adopters in their belief that a particular EBP, or EBPs in general, are compatible with their clients suggests an important lever for intervention. Continued work is needed with non-adopters to understand these beliefs and perhaps to shift the standard manual or training protocol to fit the realities and needs of front-line service providers and administrators. Although vivid case presentations of research-based treatments hold some persuasive value with clinicians [42], more empirical research is needed on whether focused EBP promotional efforts can change attitudes or translate to training interest at the leadership level. In addition, several non-adopters were unclear about the fundamentals defining EBP, indicating that different promotional strategies are needed across NPAs.

Although leadership from adopter agencies endorsed the use of internal quality assurance mechanisms, leaders from both adopter and non-adopter agencies saw payer audits and documentation requirements as an acceptable form of quality assurance. This promising finding suggests that agency leadership may find audits helpful toward their goal of maintaining and enhancing quality. Regulators and policymakers therefore have an opportunity to design audits with the potential to regulate the use of EBPs [43–45]. If audits can at least partially achieve the goal of driving clinical practice and quality, perhaps it is time to revisit the oft-stated axiom that “mandates do not work” to improve EBP implementation. Similar to prior findings [13], our results suggest that clear messaging and prioritization are essential for such mandates to achieve their desired effects. More research is also needed on the most effective strategies and channels for educating stakeholders about, and promoting, EBPs.

Several study limitations should be mentioned. First, our results are based on interview self-report data, and self-presentation bias may be present at the individual or organizational level, although most indicators suggest that administrators were forthright; indeed, many seemed eager to share frank opinions. Second, although our raters achieved perfect reliability, our classification of agencies as adopters or non-adopters was not confirmed with observation or chart review. Recent research cautions that therapist reports of the services they deliver may be inaccurate, overestimating the amount of EBP present [46]. An additional limitation of our classification is that agency leaders were not specifically queried on the NREPP registry but rather probed about general practices; we may have missed reports of some EBPs as a result. Third, this study included only the perspective of agency leadership; it would be interesting to also include the perspectives of other stakeholders, including therapists and consumers, to understand whether their perspectives diverge [2, 13]. Fourth, our qualitative sample was small, limiting the statistical comparisons we can make between characteristics of adopter and non-adopter leaders and agencies. Lastly, our findings are specific to one behavioral health system, and factors specific to this system may limit their generalizability.

Conclusions

The current study lends support to the idea that EBPs are diffusing beyond those reached through formal initiatives. Despite this promising finding, our results suggest less-than-optimal training and supervisory conditions with little attention to fidelity, which may compromise outcomes [47]. It is still impressive, however, that these agencies have chosen to implement EBPs using their own resources. Compared with the cost of undertaking entire initiatives, it may be cost-effective for policymakers to intervene with targeted strategies to enhance training, supervision, and fidelity in these “low-hanging fruit” adopting agencies. It is also possible that self-resourced adoption endows leadership with more buy-in than participation in EBP training initiatives does, another empirical question for future research. The present research suggests the need for alternative strategies, beyond additional training initiatives, to educate about and promote EBPs more widely within a system (including among consumers).

Notes

  1. Community Behavioral Health (CBH) is a not-for-profit 501(c)(3) corporation contracted by the City of Philadelphia for provision of behavioral health coverage for the City’s over 500,000 Medicaid-enrolled individuals.

Abbreviations

CBH: Community Behavioral Health

DBHIDS: Department of Behavioral Health and Intellectual disAbility Services

EBPs: Evidence-based practices

NPA: Non-participating agency

NREPP: National Registry of Evidence-based Programs and Practices

References

  1. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv. 2006;3:61–72.

  2. Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: a multiple stakeholder analysis. Am J Public Health. 2009;99:2087–95.

  3. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS). Ment Health Serv Res. 2004;6:61–74.

  4. Ganju V. Implementation of evidence-based practices in state mental health systems: implications for research and effectiveness studies. Schizophr Bull. 2003;29:125–31.

  5. Gotham HJ. Advancing the implementation of evidence-based practices into clinical practice: how do we get there from here? Prof Psychol Res Pract. 2006;37:606–13.

  6. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press; 2001. Available from: http://www.nap.edu/catalog/10027

  7. Cristofalo MA. Implementation of health and mental health evidence-based practices in safety net settings. Soc Work Health Care. 2013;52:728–40.

  8. Hemmelgarn AL, Glisson C, James LR. Organizational culture and climate: implications for services and interventions research. Clin Psychol Sci Pract. 2006;13:73–89.

  9. Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, Williams J. Long-term sustainability of evidence-based practices in community mental health agencies. Adm Policy Ment Health. 2014;41:228–36.

  10. Klein KJ, Knight AP. Innovation implementation: overcoming the challenge. Curr Dir Psychol Sci. 2005;14:243–6.

  11. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3:26.

  12. Rubin RM, Hurford MO, Hadley T, Matlin S, Weaver S, Evans AC. Synchronizing watches: the challenge of aligning implementation science and public systems. Adm Policy Ment Health. 2016. Available from: http://link.springer.com/10.1007/s10488-016-0759-9

  13. Beidas RS, Stewart RE, Adams DR, Fernandez T, Lustbader S, Powell BJ, et al. A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Adm Policy Ment Health. 2016;43:893–908.

  14. Olin SS, Nadeem E, Gleacher A, Weaver J, Weiss D, Hoagwood KE, et al. What predicts clinician dropout from state-sponsored managing and adapting practice training. Adm Policy Ment Health. 2015. Available from: http://link.springer.com/10.1007/s10488-015-0709-y

  15. Skriner LC, Wolk CB, Stewart RE, Adams DR, Rubin RM, Evans AC, et al. Therapist and organizational factors associated with participation in evidence-based practice initiatives in a large urban publicly-funded mental health system. J Behav Health Serv Res. 2017. Available from: http://link.springer.com/10.1007/s11414-017-9552-0

  16. Powell BJ, Beidas RS, Rubin RM, Stewart RE, Wolk CB, Matlin SL, et al. Applying the policy ecology framework to Philadelphia’s behavioral health transformation efforts. Adm Policy Ment Health. 2016;43:909–26.

  17. Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Res Soc Work Pract. 2014;24:192–212.

  18. Creed TA, Stirman SW, Evans AC, Beck AT. A model for implementation of cognitive therapy in community mental health: the Beck initiative. Behav Ther. 2010;37:56–65.

  19. Stirman SW, Spokas M, Creed TA, Farabaugh DT, Bhar SS, Brown GK, et al. Training and consultation in evidence-based psychosocial treatments in public mental health settings: the ACCESS model. Prof Psychol Res Pract. 2010;41:48–56.

  20. Beidas RS, Adams DR, Kratz HE, Jackson K, Berkowitz S, Zinny A, et al. Lessons learned while building a trauma-informed public behavioral health system in the City of Philadelphia. Eval Program Plann. 2016;59:21–32.

  21. Achara-Abrahams I, Evans AC, King JK. Recovery-focused behavioral health systems transformation: a framework for change and lessons learned from Philadelphia. Addict Recovery Manag Theory Sci Pract. 2011:187–208.

  22. Glaser B, Strauss A. The discovery of grounded theory: strategies for qualitative research. New Brunswick: Aldine Transaction; 1999.

  23. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42:1758–72.

  24. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.

  25. Substance Abuse and Mental Health Services Administration. National registry of evidence-based programs and practices. Available from: https://www.samhsa.gov/nrepp

  26. Stewart RE, Adams DR, Mandell DS, Hadley TR, Evans AC, Rubin R, et al. The perfect storm: collision of the business of mental health and the implementation of evidence-based practices. Psychiatr Serv. 2016;67:159–61.

  27. Webb CA, DeRubeis RJ, Barber JP. Therapist adherence/competence and treatment outcome: a meta-analytic review. J Consult Clin Psychol. 2010;78:200–11.

  28. Crits-Christoph P, Frank E, Chambless DL, Brody C, Karp JF. Training in empirically validated treatments: what are clinical psychology students learning? Prof Psychol Res Pract. 1995;26:514–22.

  29. Hoge MA, Tondora J, Marrelli AF. The fundamentals of workforce competency: implications for behavioral health. Adm Policy Ment Health. 2005;32:509–31.

  30. Weissman MM, Verdeli H, Gameroff MJ, Bledsoe SE, Betts K, Mufson L, et al. National survey of psychotherapy training in psychiatry, psychology, and social work. Arch Gen Psychiatry. 2006;63:925.

  31. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010;30:448–66.

  32. Nakamura BJ, Selbo-Bruns A, Okamura K, Chang J, Slavin L, Shimabukuro S. Developing a systematic evaluation approach for training programs within a train-the-trainer model for youth cognitive behavior therapy. Behav Res Ther. 2014;53:10–9.

  33. Sholomskas D, Syracuse-Siewert G, Rounsaville B, Ball S, Nuro K, Carroll K. We don’t train in vain: a dissemination trial of three strategies of training clinicians in cognitive behavioral therapy. J Consult Clin Psychol. 2005;73:106–15.

  34. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.

  35. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  36. Nelson TD, Steele RG. Predictors of practitioner self-reported use of evidence-based practices: practitioner training, clinical setting, and attitudes toward research. Adm Policy Ment Health. 2007;34:319–30.

  37. Nelson TD, Steele RG, Mize JA. Practitioner attitudes toward evidence-based practice: themes and challenges. Adm Policy Ment Health. 2006;33:398–409.

  38. Pagoto SL, Spring B, Coups EJ, Mulvaney S, Coutu M-F, Ozakinci G. Barriers and facilitators of evidence-based practice perceived by behavioral science health professionals. J Clin Psychol. 2007;63:695–705.

  39. Riley WT, Schumann MF, Forman-Hoffman VL, Mihm P, Applegate BW, Asif O. Responses of practicing psychologists to a web site developed to promote empirically supported treatments. Prof Psychol Res Pract. 2007;38:44–53.

  40. Gallo KP, Barlow DH. Factors involved in clinician adoption and nonadoption of evidence-based interventions in mental health. Clin Psychol Sci Pract. 2012;19:93–106.

  41. Stewart RE, Stirman SW, Chambless DL. A qualitative investigation of practicing psychologists’ attitudes toward research-informed practice: implications for dissemination strategies. Prof Psychol Res Pract. 2012;43:100–9.

  42. Stewart RE, Chambless DL. Interesting practitioners in training in empirically supported treatments: research reviews versus case studies. J Clin Psychol. 2010;66:73–95.

  43. Mandell DS, Barry CL, Marcus SC, Xie M, Shea K, Mullan K, et al. Effects of autism spectrum disorder insurance mandates on the treated prevalence of autism spectrum disorder. JAMA Pediatr. 2016;170:887–93.

  44. Rieckmann T, Bergmann L, Rasplica C. Legislating clinical practice: counselor responses to an evidence-based practice mandate. J Psychoactive Drugs. 2011;Suppl 7:27–39.

  45. Ziegenfuss JT, Hadley T. Understanding purposes of regulation: a case example in mental health. Adm Soc Work. 1980;4:53–60.

  46. Creed TA, Wolk CB, Feinberg B, Evans AC, Beck AT. Beyond the label: relationship between community therapists’ self-report of a cognitive behavioral therapy orientation and observed skills. Adm Policy Ment Health. 2016;43:36–43.

  47. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117. Available from: http://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-8-117


Acknowledgements

We are especially grateful for the support that the Department of Behavioral Health and Intellectual disAbility Services provided for this project, and for the Evidence-Based Practice & Innovations Center (EPIC) group. We would also like to thank all of the agency administrators whose participation made this study possible.

Funding

The UPenn Implementation Science Working Group Pilot Fund (Stewart), NIMH F32 MH103960 (Stewart), and NIMH K23 MH099179 (Beidas).

Availability of data and materials

The interview guide, codebook, transcripts, and analyses are available from the corresponding author.

Authors’ contributions

All authors substantially contributed to the conception, design, and analysis of the work and provided the final approval of the version to be published.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

The Institutional Review Boards of the University of Pennsylvania and the City of Philadelphia approved all study procedures and all ethical guidelines were followed.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information


Corresponding author

Correspondence to Rebecca E. Stewart.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Stewart, R.E., Adams, D.R., Mandell, D.S. et al. Non-participants in policy efforts to promote evidence-based practices in a large behavioral health system. Implementation Sci 12, 70 (2017). https://doi.org/10.1186/s13012-017-0598-4
