Implementation Science


The 4KEEPS study: identifying predictors of sustainment of multiple practices fiscally mandated in children’s mental health services

Implementation Science. 2016;11:31

https://doi.org/10.1186/s13012-016-0388-4

Received: 23 October 2015

Accepted: 23 February 2016

Published: 9 March 2016

Abstract

Background

Research to date has largely focused on predictors of adoption and initial implementation of evidence-based practices (EBPs), yet sustained implementation is crucial to deliver a return on investments in dissemination. Furthermore, most studies focus on single EBPs, limiting opportunities to study the fit between the characteristics of EBPs and implementation contexts.

Methods/design

This observational study will characterize implementation sustainment and identify organizational and therapist characteristics that predict sustainment of multiple practices being implemented within a fiscal mandate in the largest public mental health system in the USA. Specific aims are to (1) characterize sustainment outcomes (volume/penetration, EBP concordant care); (2) use mixed methods to characterize inner context (agency- and therapist-level) factors and early implementation conditions; and (3) identify inner context factors and early implementation conditions that predict sustainment outcomes. This study will undertake original data collection and analysis of existing data sources to achieve its aims. Archived reports and documents will be used to characterize early implementation conditions in 102 agencies. Administrative claims data will be used to characterize volume and penetration outcomes over 8 years. Therapist and program manager surveys will be administered to characterize sustained EBP concordant care and inner context determinants of sustainment. An in-depth study in a subset of agencies will yield interview data and recordings of treatment sessions for validation of the EBP concordant care scale.

Discussion

This project will yield new understanding of whether and how multiple EBPs can be sustained in public mental health systems undergoing a policy-driven community implementation effort. We will produce generalizable models for characterizing sustainment, including feasible and flexible measurement of practice across multiple EBPs. The findings will inform the development of implementation interventions to promote sustained delivery of EBPs to maximize their public health impact.

Keywords

Children’s mental health; Fiscally driven implementation

Background

Documented quality gaps between usual care (UC) and evidence-based practices (EBPs) [1, 2] have prompted large-scale implementation efforts in mental health (MH) systems [3–7]. Mandating EBPs in public managed care in children’s MH services began with the reform of services in Hawaii’s Department of Health following a 1999 consent decree [8]. By 2008, 90 % of state MH authorities reported strategies to install EBPs; 12 states had mandated the use of EBPs in public MH systems, with 8 states promoting, supporting, or requiring specific practices statewide [9]. These costly efforts provide natural laboratories to identify determinants of the sustainment of EBPs, an understudied topic in implementation science [10]. The Knowledge Exchange on Evidence-based Practice Sustainment (4KEEPS) study (R01 MH100134; MPIs Lau and Brookman-Frazee) is designed to understand the extent to which such investments result in sustained reach and use of multiple EBPs and to identify determinants of sustainment that can be leveraged in novel implementation interventions. With few exceptions, most studies have focused on implementation outcomes of a single intervention. Yet MH systems are unlikely to implement a single intervention as patient needs vary [3, 11, 12]. Little is known about how sustainment outcomes and determinants of these outcomes vary by practice characteristics.

Context of current study (see Additional file 1: History of Developments leading to the LACDMH PEI Implementation)

The Los Angeles County Department of Mental Health (LACDMH) is the nation’s largest county MH department, serving, on average, more than 250,000 county residents of all ages every year [13]. In 2010, LACDMH launched the Prevention and Early Intervention transformation with training and implementation support for an initial set of six evidence-based/informed practices (hereafter referred to as practices) to address a range of prevalent youth MH problems, including Cognitive Behavioral Interventions for Trauma in Schools (CBITS), Child-Parent Psychotherapy (CPP), Managing and Adapting Practices (MAP), Seeking Safety (SS), Trauma-Focused Cognitive Behavior Therapy (TF-CBT), and Triple P Positive Parenting Program (Triple P). Therapist trainings commenced in May 2010, and in fiscal year 2010–2011, over 32,000 children and transition age youth were served in Prevention and Early Intervention (PEI) programs [14]. The timing of the current study relative to the maturity of the PEI transformation permits examination of practice sustainment up to 8 years after adoption.

We used the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework [15] to frame the PEI transformation timeline and variables examined in the current study (see online resources Fig. 1). The EPIS model outlines four phases of implementation and highlights potential predictors of outcomes in the outer (e.g., funding, policy, organizational networks) and inner (e.g., organizational, leadership, therapist characteristics) contexts. Given the nature of the PEI transformation, “outer context” variables are shared across agencies and therapists, permitting us to isolate key aspects of the inner context (organizational and therapist variables) central in socio-technical models of implementation [16].
Fig. 1

Applying Aarons et al.’s [15] EPIS Framework to the LACDMH Prevention and Early Intervention (PEI) timeline

Focus on sustainment

Sustainment is defined as the extent to which a newly implemented practice is maintained or institutionalized within a service setting’s ongoing, stable operations [17]. Stirman et al. [18] reviewed 125 studies examining sustainment across service settings and interventions and highlighted a general lack of methodological rigor. Recommendations included developing measures of multiple sustainment outcomes, assessing sustainment over years rather than at a single time point, and developing methods to measure intervention adaptation that may occur to achieve sustainment. Thus, 4KEEPS fills important gaps in the literature through long-term post-adoption follow-up, characterizing and understanding the impact of naturalistic adaptations of practices, and examining multiple indicators of sustainment of multiple practices within a “marketplace” of implementation.

It is essential to consider multiple sustainment outcomes jointly. One such sustainment outcome is penetration, defined as the integration of a practice within a service setting. It can be measured as the number of eligible individuals who use/deliver a treatment, divided by the number of eligible individuals [17]. An important limitation of indexing a penetration ratio alone is that it does not track the absolute scale of an implementation effort over time. Moreover, the denominators used to calculate penetration rates vary over time with changes in the workforce and system policies. Therefore, in addition to penetration, we assess volume of practices delivered over time in raw units of numbers of agencies, therapists, children, and units of service to characterize the scale of implementation impact.

Use of system administrative data provides one method of measuring penetration and volume and aligns with calls for use of these data to understand practice patterns and inform implementation efforts [19, 20]. However, by itself, penetration/volume of a practice within a workforce is insufficient to determine the maintained success of the implementation. Continued delivery with poor integrity to the practice model is unlikely to produce the expected value of the practice [21]. Likewise, sustained practice integrity has little impact if few therapists persist in implementation. Thus, indicators of practice penetration/volume and practice integrity are essential in the study of sustainment.

Inner context determinants of sustainment outcomes

Within the PEI transformation, outer contextual factors are held relatively constant across agencies, as the fiscal mandate, contract conditions, reimbursement policies, and revenue stream apply uniformly to all agencies. As such, we will focus on identifying determinants of sustainment within the inner context.

Organizational factors

Over 100 agencies are contracted with LACDMH to deliver at least one of the six practices of interest. These agencies vary widely in size, structure, and resources to support implementation, which creates a unique opportunity to examine agency-level determinants of practice sustainment. Although research indicates that organization- and therapist-level factors influence attitudes toward and adoption of EBPs [22–24], little research links these factors to sustainment. Organizational support for practices predicts adoption and initial implementation [23, 25], and organizational climate is associated with provider willingness to deliver practices [24, 26]. Specifically, leadership support, quality assurance structures, and staff incentives [27] predict positive therapist attitudes [28, 29] as well as workforce outcomes, including lower turnover [23, 30]. Beyond early implementation phases, 4KEEPS will apply mixed methods to examine these associations with long-term sustained implementation of multiple practices at the therapist and agency levels.

Therapist factors

Since 2010, approximately 8500 therapists have delivered at least one of the six practices of interest. Therapist background factors have been shown to influence attitudes toward EBPs [22], and therapist attitudes have been linked to practice behavior [31]. A survey conducted soon after the PEI transformation revealed that significant variance in therapist attitudes was explained by practice type and that practice-specific ratings of intervention appeal predicted concurrent therapist reports of EBP use [32]. 4KEEPS will prospectively examine relations between dimensions of therapist attitudes and sustained use of practices over time as indexed by administrative, self-report, and observational data.

There is growing attention to the role of therapist adaptations to practices and their implications for implementation outcomes [18, 33, 34]. Although it is often implicitly expected that practices be delivered to efficacy trial adherence standards, this is at odds with contingencies in UC settings [35]. Therapists commonly raise the concern that practices may not fit the needs of clients seen in UC and must be adapted, particularly for ethnic minority groups not represented in controlled trials [36–39]. Therapists may adapt by reframing interventions or incorporating techniques or components to promote acceptance or address group-specific needs [40]. Such investments in tailoring practices may increase commitment to delivering the EBP long term [41, 42]. On the other hand, inappropriate modifications or omissions represent drift [34]. The extent to which community therapists engage in adaptations, and whether those adaptations are consistent with practice integrity, remains unclear.

The overall objectives of the 4KEEPS project are to characterize sustainment outcomes and identify inner context factors that predict the sustained penetration and use of multiple practices delivered through a fiscal mandate in children’s MH services. Supplemental aims address client-level determinants of implementation and potential racial disparities (see Additional file 2: 4KEEPS Disparities Supplement Aims).

Aim One: Characterize sustainment outcomes of six practices within the PEI transformation

EBP concordant care

Fidelity refers to the degree to which an intervention is delivered as intended by the program developers [43]. Fidelity to EBP is considered a central implementation outcome [17]. However, traditional fidelity measures may not be appropriate in the current study context. Assessments of therapist practice must be both effective and efficient to track the success of implementation efforts large enough to impact public health [21, 44, 45]. Direct observation, the gold standard in evaluating fidelity in controlled trials, is not feasible in large-scale UC implementation. Moreover, measuring fidelity across interventions that vary in content and focus presents a challenge when multiple EBPs are adopted.

Therefore, we assess a related construct to characterize sustained use of a practice—EBP concordant care. We depart from protocol-specific fidelity measures that include features specific to manualized protocols that do not necessarily generalize across interventions within a family of treatments shown to be effective for ameliorating a MH condition. EBP concordant care indexes the degree to which a therapist’s practice resembles the essential strategies one would expect within an evidence-based protocol for a given problem focus. Drawing from established tools used to characterize UC therapist practice, we developed the EBP Concordant Care Assessment (ECCA). The ECCA will provide a common metric to assess the extent to which a therapist delivers practices considered “essential” in EBPs for six major child MH targets: anxiety, depression, conduct problems, trauma, attachment problems, and substance use. This measure is completed by therapists and will be validated against observational coding.

Practice volume and penetration

Administrative claims data from LACDMH will be used to measure the volume and penetration of the six practices over time, using a variety of units of analysis, from FY10-11 through FY17-18. The volume of any given practice is defined as the number of agencies continuing to be reimbursed for the practice, therapists continuing to submit claims for the practice, unique clients served by the practice, and units of service being provided within the practice. Within PEI, providers must submit claims to be reimbursed for any service rendered, and each claim must specify the practice. Penetration is calculated as the volume (claims, clients, therapists, agencies) for a given practice divided by the total volume across practices. Claims data will index volume and penetration for 8 years from the outset of the PEI transformation.
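As an illustration of these definitions, the following is a minimal sketch of volume and penetration computed from claim records; the toy records and field names are hypothetical and do not reflect the LACDMH claims schema:

```python
# Hypothetical claim records for illustration only; real PEI claims carry
# many more fields (client, date, units of service, etc.).
claims = [
    {"practice": "MAP", "therapist": "T1", "agency": "A1"},
    {"practice": "MAP", "therapist": "T2", "agency": "A1"},
    {"practice": "TF-CBT", "therapist": "T2", "agency": "A1"},
    {"practice": "Triple P", "therapist": "T3", "agency": "A2"},
]

def volume(claims, practice):
    """Raw volume of one practice: claim count, unique therapists, unique agencies."""
    subset = [c for c in claims if c["practice"] == practice]
    return {
        "claims": len(subset),
        "therapists": len({c["therapist"] for c in subset}),
        "agencies": len({c["agency"] for c in subset}),
    }

def penetration(claims, practice, unit="claims"):
    """Volume for one practice divided by the total volume across practices."""
    practices = {c["practice"] for c in claims}
    total = sum(volume(claims, p)[unit] for p in practices)
    return volume(claims, practice)[unit] / total
```

With these toy data, MAP accounts for 2 of 4 claims, so its claim-level penetration is 0.5; the same functions could in principle be applied per fiscal quarter to trace trajectories.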

Aim Two: Use mixed methods to characterize inner context factors and early implementation conditions that potentially predict EBP sustainment

The ultimate goal of this study is to identify factors associated with the long-term sustainment of multiple practices introduced in a major system reform toward EBP implementation. Challenges with this goal include the potentially time-varying nature of inner context factors and the time period of the study (i.e., assessing these factors after initial implementation). Thus, we use a multi-method approach including repeated measures of potentially time-varying inner context factors and analyses of archival data on early implementation conditions. We apply a mixed quantitative and qualitative approach since their combined use provides a better understanding of the implementation context than either approach alone [46]. Qualitative methods (interviews and archival document review) are used to obtain in-depth understanding of the conditions associated with success in sustainment while quantitative methods (survey methods) test hypotheses based on existing models of implementation. The key constructs of interest include agency-level organizational factors (Agency Structure, Climate, Leadership, and Early Implementation Conditions) and therapist-level attitudes toward and adaptations to practices.

Aim Three: Identify inner context and early implementation conditions that determine sustainment outcomes

We will use multi-level modeling to examine predictors of volume/penetration trajectories and end point estimates of therapist delivery of EBP concordant care. We will examine whether individual determinants interact to impact sustainment outcomes and whether determinants differ based on the type of sustainment outcome (i.e., EBP concordant care vs. volume/penetration).

Methods/design

Existing data sources

Claims data

Administrative claims data will index PEI-reimbursed claims for the six practices from the outset of the reform in FY 2009–2010 through the end of the study in FY 2018–2019. Data obtained from 2009–2014 (19 fiscal quarters) have yielded over 2.3 million psychotherapy claims for more than 87,000 unique child and transition age youth (TAY) clients provided by more than 8500 MH clinicians within 94 agencies (Brookman-Frazee et al., under review). Gross volume and penetration will be calculated at each level (claims, client, therapist, agency) per fiscal quarter for each practice. Claims data for each practice may have up to a four-level structure, with claims nested within fiscal quarters, within therapists, within agencies. Because some therapists will transition across agencies within LACDMH over time, cross-classified models will be used to account for multiple membership nesting [47]. Claims will also be indexed at the therapist level to characterize the volume of claims per therapist for each practice relative to their total claims across practices.
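The therapist-level indexing described above can be sketched as follows; the toy data and function are illustrative assumptions, not the study’s analysis code:

```python
from collections import defaultdict

# Hypothetical (therapist, practice) claim pairs; identifiers are illustrative.
claims = [
    ("T1", "MAP"), ("T1", "MAP"), ("T1", "TF-CBT"),
    ("T2", "SS"),
]

def practice_share(claims):
    """For each therapist, the share of that therapist's total claims in each practice."""
    counts = defaultdict(lambda: defaultdict(int))
    for therapist, practice in claims:
        counts[therapist][practice] += 1
    return {
        therapist: {p: n / sum(by_practice.values()) for p, n in by_practice.items()}
        for therapist, by_practice in counts.items()
    }
```

Here T1’s claims split two-thirds MAP and one-third TF-CBT, the kind of within-therapist proportion that can then serve as an outcome in multilevel models.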

Document extraction

Since the study commenced years after the PEI transformation launched, prospective assessment of early implementation conditions was not possible. However, readiness for implementation and early implementation integrity are two factors that may predict our sustainment outcomes of interest. As such, we will extract information from detailed documentation generated by Technical Assistance Site Visits carried out by LACDMH from June 2012 through December 2013. The purpose of the visits was to support agencies in claiming procedures and to assess implementation milestones (e.g., therapist training, instantiation of supervision/consultation/fidelity monitoring, outcome tracking). Agencies completed a Pre-Site Visit Questionnaire (PSVQ) prior to each site visit, which included a 3-h meeting led by LACDMH staff with agency program managers, supervisors, and therapists. Site Visit Reports (SVRs) summarizing discussions were produced within 3 weeks of each visit. The PSVQs and SVRs will be coded to provide a measure of early implementation conditions at the agency level.

Original data collection

Surveys

Survey data will be collected from therapists and program managers across agencies contracted to provide at least one of the six practices of interest to children or transition age youth. Therapists currently billing for psychotherapy services for at least one of these practices are eligible. We will enumerate our therapist sample by contacting program managers at each contracted agency to request email contacts for all eligible therapists. Participants will be invited to participate via email with a personalized link to the online survey. The sampling frame for the survey in fiscal year 2013–2014 included 102 agencies; we anticipate an agency-level response rate of 75 %, yielding a sample of 75 agencies. We anticipate an average of 18–24 eligible therapists per agency and a response rate of 50 % at the therapist level, to yield a survey sample of approximately 800 therapists.
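The anticipated yield follows directly from these planning figures; a quick arithmetic check (all values are the anticipated rates stated above, not observed data):

```python
# Planning assumptions stated in the text.
agencies_in_frame = 102
agency_response_rate = 0.75        # anticipated agency-level response
therapists_per_agency = (18, 24)   # anticipated range of eligible therapists
therapist_response_rate = 0.50     # anticipated therapist-level response

responding_agencies = agencies_in_frame * agency_response_rate  # 76.5, reported as ~75

# Lower and upper bounds on the number of responding therapists.
low = responding_agencies * therapists_per_agency[0] * therapist_response_rate
high = responding_agencies * therapists_per_agency[1] * therapist_response_rate
# low = 688.5, high = 918.0: consistent with the target of approximately 800.
```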

Table 1 (see online resources) lists survey measures of sustainment outcomes and agency- and therapist-level predictors of sustainment to be included in therapist and program manager surveys. All established measures have strong evidence of reliability and validity. Two surveys will be fielded over the course of the study in years 2 and 4 to provide repeated measures of the inner context determinants of sustainment (organizational and therapist characteristics). These data will permit examination of the stability of inner context factors and will provide a more temporally proximal prediction of sustainment outcomes during the final year of the study.
Table 1

4KEEPS key constructs and measures as predictors of sustainment outcomes

Therapist attitudes

Perceived Characteristics of Intervention Scale (PCIS)
  Subscales: Relative advantage, compatibility, complexity, potential for reinvention
  Sample item: “(Practice) is more effective than other therapies I have used.”
  Respondents: therapists and program managers
  Citation: Cook JM, Thompson R, Schnurr PP. Perceived characteristics of intervention scale: development and psychometric properties. Assessment. 2015;22(6):704–14. doi:10.1177/1073191114561254.

Evidence-Based Practice Attitude Scale (EBPAS)
  Subscales: Divergence, openness, limitations (practice-specific)
  Sample items: “I like to use new types of therapy/interventions to help my clients.” “(Practice) detracts from truly connecting with my clients.”
  Respondents: therapists
  Citation: Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G, et al. Psychometric properties and U.S. national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22(2):356–65. doi:10.1037/a0019188.

Knowledge and confidence (project developed)
  Sample items: “I am well prepared to deliver (Practice) even with challenging clients.” “I am confident in my ability to implement (Practice).”
  Respondents: therapists

Implementation support

Practice-specific implementation support (project developed)
  Sample item: “Rate the frequency that the following are part of ongoing supervision for: (a) live observation; (b) review of recorded session; (c) case discussion; (d) review of client progress monitoring (outcomes, dashboards) …”
  Respondents: therapists and program managers

Organizational context

Organizational climate measure
  Subscales: Autonomy, Rational Goal (Performance Feedback), HR Involvement (Therapists)
  Sample item: “Program managers and/or agency leaders let people make their own decisions much of the time.”
  Respondents: therapists and program managers
  Citation: Patterson MG, West MA, Shackleton VJ, Dawson JF, Lawthom R, Maitlis S, et al. Validating the organizational climate measure: links to managerial practices, productivity and innovation. J Organ Behav. 2005;26:379–408. doi:10.1002/job.312.

Organizational readiness to change
  Subscales: Staffing, Cohesion, Stress
  Sample item: “Frequent therapist turnover is a problem for your program.”
  Respondents: program managers
  Citation: Lehman WEK, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22:197–209. doi:10.1016/S0740-5472(02)00233-7.

Maslach Burnout Inventory (adapted)
  Subscales: Exhaustion, Personal Accomplishment
  Sample items: “I feel emotionally drained from my work.” “I feel I’m positively influencing the lives of my clients through my work.”
  Respondents: therapists
  Citation: Schaufeli WB, Leiter MP. Maslach Burnout Inventory–general survey. In: The Maslach burnout inventory-test manual. 1996. p. 19–26.

Therapist adaptations

Adaptations of evidence-based practices
  Sample items: “I modify how I present or discuss the components of (Practice).” “I apply (Practice) to novel populations.”
  Respondents: therapists
  Citation: Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8(65):1–12. doi:10.1186/1748-5908-8-65.

Barriers to implementation (client-level)

Emergent life event (ELE) and engagement measure (project developed)
  Sample item: “Did the client or client’s caregiver raise a stressful life event that was not the intended focus of session? If yes, to what extent were you able to carry out your intended activities or return to the focus of session?”
  Respondents: therapists

In-depth study

Multi-method data including surveys, qualitative interviews, and behavioral observations will be collected in a subset of agencies. The objectives of the in-depth component of the 4KEEPS study are twofold. First, it will furnish qualitative interviews essential to our mixed methods (QUAN + QUAL) design, with quantitative data and qualitative data given equal weight. Combined analyses will be used for convergence (i.e., triangulation of data across methods to determine if the same conclusion is reached) and complementarity (i.e., qualitative data provide depth of understanding while quantitative data provide breadth of understanding). Second, the in-depth study will allow us to validate therapist self-reports of EBP concordant care against behavioral observations.

The sampling frame for the in-depth study will be the agencies enumerated into the survey sample described above. We will target a sample of 120 therapists and at least one program manager at each agency providing administrative oversight for PEI practices. Therapist interview guides will focus on training experiences, attitudes toward practices, and adaptations made to practices. Program manager interview guides will focus on barriers and facilitators of implementation of practices, the impact of adopting practices, and, when relevant, reasons for de-adoption of practices. Furthermore, therapists will provide self-reports of practice implementation on the ECCA and supply audio recordings of three therapy sessions for three clients.

In-depth study measure development activities

ECCA development

Multiple inputs have informed the development of a generic measure of EBP concordant care in the treatment of the six child MH problem areas addressed by the six practices of interest: anxiety, attachment problems, conduct, depression, substance abuse, and trauma. The ECCA will include a common set of items regardless of which practice the therapist indicates they are using, with separate summary scores for each family of interventions. Item content was adapted from existing practice inventories [48–51], and additional items were derived from intervention materials for practices not reflected in existing inventories. Practice experts, including intervention developers (i.e., authors of treatment manuals) and master trainers, were enlisted in the item development process and in the determination of EBP concordance for the treatment of child MH target areas. For validation, we will examine the correspondence between therapist self-report on the ECCA and observer ratings at the session level using a parallel ECCA observational coding system.

Therapist adaptations of practices

To develop a measure of therapist adaptations to practices, we will use mixed methods sequentially (QUAN → qual → QUAN). In the first therapist survey (year 2), we will collect quantitative data to characterize therapist adaptations on a large scale using a questionnaire adapted from the Stirman et al. [52, 53] framework of adaptations and modifications of EBPs. Qualitative data from therapist interviews will later inform the refinement of the second therapist survey (year 4), using a Rapid Assessment Procedure (RAP) designed to provide depth and specificity in quantitative measures for assessing observed phenomena [54].

Discussion

The 4KEEPS project will yield new understanding of whether and how multiple EBPs can be sustained in public MH systems undergoing policy-driven reform. This project is significant for many reasons. First, conducting an observational study of UC implementation of this magnitude maximizes the representativeness of agencies and therapists and the resultant generalizability of findings. Most studies have focused on investigator-driven implementation, whereas PEI was driven by system-level stakeholders within a major policy reform. Unlike experimental studies, observational studies of system change are less likely to include only the ideal organizations with the greatest motivation and readiness to implement. There is greater independence from EBP developers and their implementation “superstructures” when studying implementation as usual [55]. The outer context policy change is likely to be a model for EBP implementation in public MH systems. Yet we know little about how such policies may unfold in the years following adoption.

Second, examining the sustainment of multiple practices is essential because large-scale EBP reform is unlikely to involve dissemination of a single intervention, as no single EBP will address all client MH needs. Chorpita et al. [12] found that a minimum of eight EBPs would be required to cover the range of child MH problems represented in LACDMH. The PEI transformation provides an opportunity to study the determinants of sustainment across multiple EBPs in a large, organizationally diverse, and ethnically diverse service system. In this context, penetration of a practice is relative to other available practices. This shifts the focus away from questions about whether a given intervention is likely to be sustained toward identifying characteristics of practices and their fit with organizations that promote sustainment within a marketplace of dissemination.

Third, methodological features of the study are important for the advancement of sustainment research. Metrics of practice volume and penetration and of EBP concordant care are necessary to assess the public health impact promised by dissemination of EBPs. Using generalizable methods, the study will produce feasible metrics for assessing therapist adaptations and the integrity of EBP implementation across practices. Ultimately, the findings will inform the development of implementation interventions to promote sustained delivery of EBPs to maximize their public health impact.

Abbreviations

4KEEPS: 

Knowledge Exchange on Evidence-based Practice Sustainment

EBP: 

evidence-based practice

LACDMH: 

Los Angeles County Department of Mental Health

MHSA: 

Mental Health Services Act

PEI: 

Prevention and Early Intervention

ECCA: 

Evidence-based Practice (EBP) Concordant Care Assessment

CBITS: 

Cognitive Behavioral Interventions for Trauma in Schools

CPP: 

Child-Parent Psychotherapy

MAP: 

Managing and Adapting Practices

SS: 

Seeking Safety

TF-CBT: 

Trauma-Focused Cognitive Behavior Therapy

Triple P: 

Triple P Positive Parenting Program

Declarations

Acknowledgements

We appreciate the support that the Los Angeles County Department of Mental Health (LACDMH) has provided for this project and the substantial contributions of Lillian Bando and her colleagues in the LACDMH PEI Implementation Unit and Debbie Innes-Gomberg and her team in the Mental Health Services Act (MHSA) Implementation Unit. Funding for this research project was supported by the following grants from NIMH: R01 MH100134 and R01 MH100134-S. We would like to thank the following experts who provided their time and input in the development of this project: Drs. David Chambers, Gregory Aarons, John Landsverk, Lawrence Palinkas, and Ann Garland. We also acknowledge the expert faculty of the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis, where both AL and LBF were fellows and where the idea for the study was conceived. The IRI was funded by the NIMH (R25 MH080916 PI: Enola Proctor).

Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
Department of Psychology, University of California, Los Angeles, USA
(2)
Department of Psychiatry, University of California, San Diego, USA
(3)
Child and Adolescent Services Research Center, San Diego, USA

Copyright

© Lau and Brookman-Frazee. 2016
