Open Access

Testing the leadership and organizational change for implementation (LOCI) intervention in substance abuse treatment: a cluster randomized trial study protocol

  • Gregory A. Aarons1, 2,
  • Mark G. Ehrhart3,
  • Joanna C. Moullin1, 2,
  • Elisa M. Torres1, 2 and
  • Amy E. Green1, 2

Implementation Science 2017, 12:29

DOI: 10.1186/s13012-017-0562-3

Received: 9 February 2017

Accepted: 24 February 2017

Published: 3 March 2017

Abstract

Background

Evidence-based practice (EBP) implementation represents a strategic change in organizations that requires effective leadership and alignment of leadership and organizational support across organizational levels. As such, there is a need for combining leadership development with organizational strategies to support organizational climate conducive to EBP implementation. The leadership and organizational change for implementation (LOCI) intervention includes leadership training for workgroup leaders, ongoing implementation leadership coaching, 360° assessment, and strategic planning with top and middle management regarding how they can support workgroup leaders in developing a positive EBP implementation climate.

Methods

This test of the LOCI intervention will take place in conjunction with the implementation of motivational interviewing (MI) in 60 substance use disorder treatment programs in California, USA. Participants will include agency executives, 60 program leaders, and approximately 360 treatment staff. LOCI will be tested using a multiple cohort, cluster randomized trial that randomizes workgroups (i.e., programs) within agency to either LOCI or a webinar leadership training control condition in three consecutive cohorts. The LOCI intervention is 12 months, and the webinar control intervention takes place in months 1, 5, and 8, for each cohort. Web-based surveys of staff and supervisors will be used to collect data on leadership, implementation climate, provider attitudes, and citizenship. Audio recordings of counseling sessions will be coded for MI fidelity. The unit of analysis will be the workgroup, randomized by site within agency and with care taken that co-located workgroups are assigned to the same condition to avoid contamination. Hierarchical linear modeling (HLM) will be used to analyze the data to account for the nested data structure.

Discussion

LOCI has been developed to be a feasible and effective approach for organizations to create a positive climate and fertile context for EBP implementation. The approach seeks to cultivate and sustain both effective general and implementation leadership as well as organizational strategies and support that will remain after the study has ended. Development of a positive implementation climate for MI should result in more positive service provider attitudes and behaviors related to the use of MI and, ultimately, higher fidelity in the use of MI.

Trial registration

This study was registered with ClinicalTrials.gov (NCT03042832) on 2 February 2017 (retrospectively registered).

Keywords

Implementation · Implementation strategy · Leadership · Organizational climate · Attitudes · Fidelity · Motivational interviewing

Background

Evidence-based practice (EBP) implementation represents a strategic change in organizations that requires effective leadership and the alignment of leadership and organizational support across organizational levels [1, 2]. Although there are many leadership development approaches, few are based on evidence of effectiveness, and to our knowledge, none are specifically designed to develop strategic climates for EBP implementation [3]. Changing the organizational context to support EBP is a critical challenge facing health and allied healthcare settings [4–8]. Leadership is a critical factor in developing an implementation climate [9] and improving attitudes [10–13] toward EBP, as well as clinical outcomes such as client satisfaction and quality of life [14].

Effective EBP implementation to address complex and widespread public health issues, such as substance use disorders (SUDs), remains a significant challenge. Our ability to effectively implement EBPs is as important as the EBP itself because implementation efforts often fail to effectively institute innovations in organizations [15–17]. Although there are process models to facilitate EBP intervention development and implementation [18–20], there are few empirically tested organizational strategies to facilitate EBP implementation in substance use disorder treatment (SUDT) [5, 20–22]. Rigorous testing of implementation strategies for SUDs is in its infancy, particularly for organizationally focused interventions. This study addresses these concerns and advances implementation science by testing the leadership and organizational change for implementation (LOCI, pronounced lō-sī) intervention. The goals of LOCI are to improve general leadership and implementation leadership, combined with the development and use of organizational strategies, to create a positive strategic organizational climate that supports EBP implementation and sustainment. As shown in Fig. 1, improved leadership, in combination with targeted and multilevel organizational strategies, is hypothesized to lead to improved implementation climate and to SUDT provider attitudes and behaviors that support the implementation process and implementation outcomes.
Fig. 1

Effects of LOCI on leadership, implementation and psychological safety climate, provider attitudes and citizenship, and implementation outcomes. We will compare LOCI vs. control on proximal and distal outcomes. Exploratory analyses will examine mediational and cross-level effects

This study draws on two leadership approaches, the full-range leadership (FRL) model and implementation leadership [23]. The FRL model is a validated approach to leadership for individual and organizational development [24, 25]. The FRL model includes transformational and transactional leadership and is measured with the reliable and valid Multifactor Leadership Questionnaire [26, 27]. Transformational leadership inspires and motivates employees to follow an ideal or course of action. Transactional leadership reflects a leader’s ability to manage and motivate staff through appropriate interactions and rewards [28]. Appropriate application of the FRL forms the foundation of effective leadership and may impact how employees accept the vision of the leader and follow through on job roles and tasks.

Implementation leadership involves behaviors that fall on four dimensions of being knowledgeable about the EBP being implemented, being proactive and anticipatory in problem-solving, supporting others in the implementation process, and persevering through the ups and downs of the implementation process [23]. The Implementation Leadership Scale assessing these dimensions has been validated for use in SUDT [29]. LOCI utilizes both FRL and implementation leadership as complementary leader skills and behaviors that can be utilized to help to develop a positive EBP implementation climate [13, 30, 31]. Thus, this project advances the application of leadership theory in implementation science.

In contrast to general organizational climate that supports the overall well-being of employees, a strategic climate supports a particular organizational purpose or goal [32, 33]. Implementation climate is a strategic climate defined as “employees’ shared perceptions of the importance of innovation implementation within the organization” [34]. Cross-level relationships between executive management, mid-management, and first-level leadership develop and support congruence of EBP support structures and processes, in a targeted and concerted strategy to improve implementation climate [1, 31].

LOCI development

LOCI is framed within the Exploration, Preparation, Implementation, Sustainment (EPIS) framework [21], focusing primarily on the inner organizational context and the Preparation and Implementation phases. As shown in Fig. 1, LOCI takes an active approach to improving leadership and congruent organizational strategies that lead to improved transformational and transactional leadership, implementation leadership, and subsequent implementation climate and psychological safety climate. These, in turn, are hypothesized to lead to changes in provider attitudes toward EBP, implementation citizenship behaviors, and to better EBP fidelity and implementation process [4, 35–38]. Consistent with the EPIS conceptual framework, LOCI creates change at multiple levels within a provider organization (e.g., executives/mid-managers, workgroup supervisor, service provider) to foster a context supportive of EBP implementation and sustainment.

It is important to address leadership at the appropriate organizational levels. “First-level” leaders manage and supervise those providing direct services to clients and can be particularly important in influencing staff perceptions and behavior. Workgroup supervisors are first-level leaders and are important in effective workgroup functioning [39], yet public sector first-level leaders are often promoted from clinical or service positions without adequate training in leadership [13]. SUDT workgroup supervisors are frequently responsible for implementing innovations and meeting administrative and productivity requirements [40] but can be organizational “change agents” who inspire and motivate followers to implement change [41, 42]. They play a critical role in staff perceptions of support for using an innovation such as EBP [43]. Clinical workgroup supervisors are likely to be more effective with the buy-in and support of middle and upper organizational management [44–47]. It is important to consider multiple levels within provider organizations when working to create a context supportive of EBP implementation and sustainment [21, 48–50]. LOCI entails developing organizational strategies tailored to support first-level leaders while facilitating buy-in and support from upper and mid-level management for the strategic goal of effective implementation through the creation of a positive EBP implementation climate. In LOCI, members of the research team support workgroup supervisors and management in developing a set of strategies to embed an EBP implementation climate and support EBP implementation with fidelity [2, 51].

The evidence-based practice being implemented

Motivational interviewing (MI) is a leading EBP for SUDT. MI was developed after a series of findings illustrated that therapist empathy during treatment, in contrast to the treatment intervention itself, accounted for a larger proportion of variance in SUDT outcomes including post-treatment relapse [52]. MI emphasizes using an empathic client-centered clinical style while evoking and strengthening the client’s own verbalized motivations for overcoming ambivalence about change [53]. In the spirit of MI, it is the patient’s role, as opposed to the clinician’s, to articulate and resolve their own ambivalence regarding change. Treatment providers achieving high levels of MI fidelity are informative, supportive, respectful, and collaborative; their patients are more satisfied, more likely to be retained in care, more committed to treatment regimens, and have better outcomes when compared to those whose providers do not use MI [54–64].

Preliminary studies

With NIMH support (Grant No. R21MH082731), experts in leadership and implementation from business and management schools were engaged to work with the investigators and mental health program managers (i.e., first-level leaders) and an instructional design consultant to develop the training strategy and materials. Next, a pilot study involving 12 mental health program managers, randomly assigned to LOCI or a leadership webinar, was conducted. Results demonstrated feasibility, acceptability, perceived utility, and change in leadership behavior [1, 23, 65].

Significance

There are two main potential impacts of the proposed study. First, improving leadership and organizational climate for EBP will both advance implementation science and provide an empirically tested implementation strategy to decrease the lag between intervention or innovation development and deployment and sustainment in usual care settings [66]. Demonstrating the efficacy of LOCI will provide service systems and health care organizations with a way to simultaneously improve leadership and institute organizational strategies to support and promote effective EBP implementation and sustainment. Second, although counselors and clinicians may receive training in MI, leaders and organizations often do not apply strategies to effectively implement MI or support staff in delivering MI with fidelity beyond initial training [67, 68].

To our knowledge, there are no other empirically tested workgroup level approaches that bring together leadership and organizational strategies to improve strategic climate for EBP and implementation effectiveness. In addition, this proposed study utilizes new reliable and valid measures of implementation leadership [23], implementation climate [30], and implementation citizenship [69] and will apply the Stages of Implementation Completion (SIC) measure [37] to assess LOCI and MI implementation process.

Methods

This study will test the effects of LOCI in facilitating MI implementation in SUDT service settings.

The study’s aims and hypotheses are:
  • Aim 1: Conduct a cluster randomized trial to test the effects of LOCI vs. a webinar control condition on full-range leadership and implementation leadership behaviors.
    • H1a: Provider-rated FRL will improve more in the LOCI vs. control condition.

    • H1b: Provider-rated implementation leadership will improve more in the LOCI vs. control condition.

  • Aim 2: Test the effect of LOCI on workgroup level implementation climate and psychological safety climate, provider-level attitudes toward MI, and provider implementation citizenship behaviors.
    • H2a: Implementation climate will show greater improvement in LOCI vs. control leader workgroups.

    • H2b: Psychological safety climate will show greater improvement in LOCI vs. control leader workgroups.

    • H2c: Providers in LOCI vs. control condition will report more positive attitudes to MI.

    • H2d: Providers in LOCI vs. control condition will demonstrate greater implementation citizenship.

  • Aim 3: Test the effect of LOCI on implementation outcomes MI fidelity and implementation process.
    • H3a: Workgroups in LOCI vs. control condition will show greater improvement in MI fidelity.

    • H3b: Workgroups in LOCI vs. control condition will show greater implementation efficiency as measured by the SIC.

  • Aim 4: Explore mediational and cross-level effects (e.g., effects of workgroup level climate on provider attitudes) and test the effects of leadership and strategy on implementation climate, subsequent effects on attitudes toward EBP, and implementation citizenship. Example hypotheses include:
    • H4a: More positive workgroup level implementation climate will be associated with more positive provider-level attitudes toward EBP and implementation citizenship behaviors.

    • H4b: More positive workgroup level psychological safety climate will be associated with more positive provider-level EBP attitudes and implementation citizenship behaviors.

    • H4c: Implementation climate will mediate the effects of leadership on attitudes toward EBP and implementation citizenship behaviors.

    • H4d: Provider attitudes toward EBP and implementation citizenship will mediate the effects of implementation climate on fidelity.

Design

Multiple study designs were considered by the research team in consultation with collaborating agencies. To reduce participant burden, a multiple-cohort, cluster randomized trial that randomizes workgroups within agency to either LOCI or a webinar control was selected. Allocation was determined by the project statistician. Three consecutive cohorts of participants will be included. The LOCI intervention lasts 12 months, and the webinar control intervention takes place in months 1, 5, and 8 for each cohort. The unit of analysis will be the workgroup, randomized by site within agency, with care taken that co-located workgroups are assigned to the same condition to reduce the chances of contamination.
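The allocation constraints described above (randomization of workgroups within agency, with co-located workgroups kept in the same condition) can be sketched in code. This is an illustrative sketch only, not the statistician's actual allocation procedure; the `agency` and `site` fields and the alternating site-level assignment are assumptions for demonstration.

```python
import random


def randomize_workgroups(workgroups, seed=42):
    """Illustrative allocation sketch (not the study's actual procedure):
    randomize workgroups to LOCI vs. webinar control within each agency,
    keeping co-located workgroups (same site) in the same condition."""
    rng = random.Random(seed)
    # Group workgroups by (agency, site) so co-located workgroups move together.
    sites = {}
    for wg in workgroups:
        sites.setdefault((wg["agency"], wg["site"]), []).append(wg["id"])
    # Collect each agency's site-level clusters.
    agencies = {}
    for (agency, _site), ids in sites.items():
        agencies.setdefault(agency, []).append(ids)
    # Shuffle site clusters within each agency and alternate conditions,
    # so both conditions appear within an agency when it has multiple sites.
    allocation = {}
    for site_groups in agencies.values():
        rng.shuffle(site_groups)
        for i, ids in enumerate(site_groups):
            condition = "LOCI" if i % 2 == 0 else "control"
            for wg_id in ids:
                allocation[wg_id] = condition
    return allocation
```

Because the site cluster, not the individual workgroup, is the unit of assignment, two workgroups sharing a site can never land in different conditions, which is the contamination safeguard the design describes.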

Setting

The study will take place with workgroups from SUDT agencies in California, USA. California is home to 38 million individuals in 58 counties encompassing urban and vast rural areas. The population is diverse (39.4% non-Hispanic White, 38.2% Hispanic, 14.4% Asian/Pacific Islander, 6.6% Black, 1.7% American Indian). A language other than English is spoken in 38.5% of households. Education levels are 80.8% high school graduate and 30.2% bachelor’s degree or higher, and 14.4% of persons have income below the poverty level.

Participants

Sample demographics for service providers and workgroup supervisors are expected to be approximately 65% female, 58.7% Caucasian, 27.4% Latino, 18.5% African-American, 2.6% Asian American/Pacific Islander, 2.1% Native American, and 18% “other.” Regarding education, we estimate 18.1% of staff will have a master’s degree, 28.7% college graduates, and 38.8% completed some college level coursework. The proposed settings and providers may also address clients’ co-occurring mental health problems. Consent forms will be available in English and Spanish, and there will also be the capacity for MI coding in both English and Spanish.

Recruitment

Executives who have agreed to allow recruitment at their agencies will first identify appropriate workgroups within their agencies (i.e., with opportunity to utilize MI) that will be offered the opportunity to enroll in the study. A workgroup is defined as all direct service providers (e.g., alcohol/drug counselors) who report directly to a single workgroup leader. Once workgroups are identified, executives at each agency will contact eligible workgroup supervisors to invite them to a group phone call with the investigative team to learn more about the study. All eligible participants will be given the opportunity to consent or decline participation in any of the research study components. Participants may cease participation in any part of the research study at any time.

Inclusion criteria

Participants for the proposed study will fall into three groups: (1) SUDT service providers (i.e., “service providers” n = 360), (2) SUDT workgroup supervisors (i.e., “workgroup supervisors” n = 60), and (3) SUDT agency executives and managers (i.e., “executives/managers” n = 60), total N = 480.

Exclusion criteria

Personnel not providing or supervising direct services (e.g., administrative staff) are not included as most of the measures will not be applicable to these staff. Participants must be at least 18 years of age and employed at one of the participating agencies. Supervisors who do not agree to participate in the leadership training (LOCI or webinar), and their staff, will not be eligible to participate.

LOCI intervention

(1) 360° assessment

Consenting workgroup supervisors, service providers who report to them, and executives/managers will participate in the quantitative web-based surveys. A pre-intervention baseline survey precedes intervention deployment for both conditions. All leadership measures will be assessed using a 360° assessment procedure in which leadership ratings are obtained from the leader him/herself, the leader’s subordinates, and the leader’s supervisor.

The 360° assessment data are synthesized into a detailed feedback report and used in the co-creation of a personal leadership development plan for each workgroup leader in the LOCI condition. The research team will only share the feedback reports with each individual LOCI workgroup supervisor. Workgroup supervisors will not be required to share their feedback reports with anyone, including superiors in their organization. Any feedback provided to organization management will utilize aggregate data so that no individual respondents can be identified.

(2) Leadership training

LOCI intervention

There are three main components of the LOCI intervention: (2.1) training, (2.2) coaching for first-level leaders (workgroup supervisors), and (2.3) tailoring organizational strategies to support the first-level leaders in developing an EBP climate.

2.1 Training

2.1.1 Initial leadership training: The LOCI intervention begins with a 2-day didactic and interactive session. This component includes group introductions, introduction to FRL and implementation leadership, identifying transformational, transactional, and implementation leadership behaviors, identifying behaviors that can be used to build a climate for EBP implementation, and group activities (e.g., breakout groups, interactive exercises, meals) to facilitate social interaction and learning consolidation. The training also addresses implementation climate and the nature of EBPs so that workgroup supervisors learn how to articulate a rationale for how and why EBPs can improve patient and client outcomes. Trainers and coaches work individually with each trainee in a co-creation process in reviewing their 360° assessment data, identifying strengths and areas for development, and setting a timeline for issues to be addressed immediately and those to be addressed later in coaching. Workgroup supervisors emerge with a data-based development plan including broad goals and specific action items that will guide coaching sessions throughout the remainder of the program.

2.1.2 Booster leadership training: Workgroup supervisors in the LOCI condition attend 1-day booster training sessions at 4 and 8 months after the initial training. Prior to each booster session, 360° assessments are completed for updating leadership development plans. Leadership principles, goals, and organizational strategies to support leadership are reinforced through group discussion and problem-solving.

2.1.3 Graduation: Graduation is a rite/ritual deliberately included in LOCI to mark completion of the program. Accomplishments of the participants are celebrated, challenges are processed, and future plans are shared.

2.2 Coaching

Weekly coaching calls are provided for each LOCI supervisor. Coaching calls last 15 to 30 min, with the goals of keeping participants on track with their development plans and keeping LOCI principles and strategies in mind. The weekly coaching calls focus on tracking progress on development plans, updating plans based on emergent issues or needs, problem-solving, providing additional leadership support, and identifying organizational strategy needs. Monthly group calls with LOCI workgroup supervisors are held to facilitate problem-solving and networking, discuss progress, and brainstorm solutions to obstacles encountered.

2.3 Organizational strategy meetings (OSMs)

LOCI facilitators meet concurrently with executives, managers, and LOCI workgroup supervisors (within agency) to develop, tailor, and adopt organizational strategies that support the first-level leader in creating an EBP implementation climate in their workgroup. As a guiding heuristic, climate-embedding mechanisms (see Additional file 1) are utilized and tailored for each agency and workgroup [51]. The first strategy meeting is held in person at each agency site to minimize burden on participants [1]; follow-up meetings at 4 and 8 months will occur via a web conferencing platform or teleconference. Brief monthly check-in calls (15–30 min) occur with executives only, via teleconference.

Webinar condition

The webinar control condition was selected because it is practical and parallels the typical time-limited leadership training provided for managers and supervisors in public sector service settings. Four webinar sessions from a well-known leadership training organization will be provided. The 1-h webinars focus on leading change and managing work teams. Workgroup supervisors in the webinar control condition will view the webinars at their convenience.

(3) MI training

All direct SUDT service providers and workgroup supervisors in both conditions are eligible to receive training in MI. This entails a one-time, 2-day training that occurs approximately 1 week to 1 month after the beginning of the LOCI leadership trainings. The first day addresses basic MI skills, and the second day addresses intermediate MI skills. MI training typically includes didactic instruction on topics such as the spirit and principles of MI, along with interactive exercises for skill building and practice. MI trainers are members of the Motivational Interviewing Network of Trainers (MINT).

(4) MI fidelity data collection

Participants will be asked to audio record and upload all of their sessions with clients where MI could be utilized. A digital audio recorder will be provided for each direct service provider. Although clients receiving the MI intervention may be heard on the audio recordings of these MI sessions, clients themselves will not be participants in the study. The research team will receive no identifying information about clients, and although no identifiable client information will be collected, informed consent will be sought from clients stating that they will allow the audio recording of their sessions. After audio recording MI sessions, participating service providers will log into a secure, HIPAA-compliant file-sharing website to upload each audio file. One randomly selected recording per counselor per month will be coded for fidelity, and a fidelity report for each provider will be provided back to the supervisor. Supervisors have the additional option of receiving individual training on providing fidelity feedback and coaching for direct service staff.
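The sampling rule described above (one randomly selected recording per counselor per month submitted for fidelity coding) can be sketched as follows. The data layout and function name are hypothetical illustrations, not the study's actual tooling.

```python
import random


def select_recordings_for_coding(uploads, seed=7):
    """Hypothetical sketch: choose one randomly selected recording per
    counselor per month for fidelity coding. `uploads` is a list of dicts
    with 'counselor', 'month', and 'file' keys (an assumed layout)."""
    rng = random.Random(seed)
    # Bucket uploaded files by (counselor, month).
    by_key = {}
    for upload in uploads:
        key = (upload["counselor"], upload["month"])
        by_key.setdefault(key, []).append(upload["file"])
    # Draw one file at random from each bucket.
    return {key: rng.choice(files) for key, files in by_key.items()}
```

Sampling at the counselor-month level keeps the coding workload predictable regardless of how many sessions each provider uploads.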

Data collection and management

Data are to be collected at baseline (prior to the interventions) and at 4, 8, 12, and 16 months (4 months after the end of the LOCI intervention). Organizational data will be collected from all participants (service providers, workgroup supervisors, and executives/mid-managers) via web surveys and downloaded into a database programmed for error identification and checking. Each type of respondent will complete a set of measures in their survey, and not all measures will be collected at each time point. The provider/supervisor surveys will take about 20–30 min to complete, whereas the survey for executives/mid-managers will take about 5–10 min per supervisee. Participants will be compensated with a $25 electronic gift card via email for completing each survey. A portion of the data collected in these surveys (i.e., FRL, implementation leadership, EBPAS, EBP implementation climate) will be shared with supervisors in the LOCI condition as part of the leadership development intervention. Data from managers/executives will be shared with the supervisors, comparable to a typical performance review. Data from service providers will be shared with their supervisors only in aggregate to protect the confidentiality of individual provider responses. All data will be stored on a secure server with incremental and weekly full backups.

Measures

Demographics

Data to be collected includes age, sex, education level, professional status (e.g., intern vs. professional), and job tenure [13].

Full-range leadership

The Multifactor Leadership Questionnaire (MLQ) [70] assesses transformational and transactional leadership. MLQ scores are associated with organizational climate and working alliance in behavioral health agencies [71] and predict organizational effectiveness. Service providers rate the extent to which their immediate supervisor engages in specific behaviors measured by the MLQ, and executives do the same for their supervisees. Each behavior is rated on a 5-point scale (0 = Not at all, 4 = Frequently, if not always). The MLQ has good to excellent psychometric properties with Cronbach’s alphas ranging from .76 to .90. Transformational leadership is assessed by four subscales: Idealized Influence (8 items, α = .87), Inspirational Motivation (4 items, α = .91), Intellectual Stimulation (4 items, α = .90), and Individual Consideration (4 items, α = .90). Transactional leadership is assessed with two subscales of Contingent Reward (4 items, α = .87) and Active Management-by-Exception (4 items, α = .74).

Implementation leadership

The Implementation Leadership Scale (ILS) is a brief measure of unit level leadership for EBP implementation with excellent reliability and convergent and discriminant validity [23]. The four ILS subscales are Proactive Leadership (α = .95), Knowledgeable Leadership (α = .96), Supportive Leadership (α = .95), and Perseverant Leadership (α = .96), and the total score is α = .98. Each item is scored on a 5-point Likert-type scale (0 = Not at all, 4 = To a very great extent).

Implementation climate

The Implementation Climate Scale (ICS) [30] was adapted from Klein and colleagues’ study of innovation implementation [31], informed by Schein’s construct of “embedding mechanisms” [2]. The ICS assesses employees’ shared perceptions of the policies, practices, procedures, and behaviors that are expected, supported, and rewarded to facilitate effective EBP implementation. All items are scored on a 5-point Likert-type scale (0 = Not at all, 4 = To a very great extent). The ICS has excellent internal consistency and convergent and discriminant validity, with an overall Cronbach’s alpha of .91 (18 items; 3 items per subscale). The six subscales are Focus on EBP (α = .91), Educational Support for EBP (α = .84), Recognition for EBP (α = .88), Rewards for EBP (α = .81), Selection for EBP (α = .89), and Selection for Openness (α = .91).

Psychological safety climate

Psychological safety climate (PSC) will be assessed with the PSC scale [72], which assesses employees’ shared perceptions of organizational policies, procedures, and behaviors regarding a supportive, safe work environment for taking interpersonal risks. The PSC has seven items scaled from 0 (Doesn’t apply at all) to 4 (Entirely applies) with good internal consistency reliability (α = .82), construct, and concurrent validity. PSC scores are associated with workgroup membership, contextual support, and interactions with the workgroup supervisor.

Attitudes toward EBP

The Evidence-Based Practice Attitude Scale (EBPAS) [73] will assess individuals’ attitudes toward EBP. The EBPAS has 15 items with four subscales that assess attitudes toward adoption of EBP as a function of perceived Appeal of EBP, Requirements to use EBP, provider Openness, and perceived Divergence between EBP and usual care. EBPAS total scores (α = .76) represent global attitudes toward adoption of EBP, and subscale alphas range from .66 to .91. EBPAS responses are scored on a 5-point scale (0 = Not at all, 4 = To a very great extent), and scores are associated with individual provider-level attributes and organizational characteristics [73–75].

EBP Implementation Citizenship Behavior

The Implementation Citizenship Behavior Scale (ICBS) [69] will assess individual EBP implementation citizenship. The ICBS was adapted from an existing measure of safety citizenship in the workplace [76] and assesses the extent to which individual workgroup members go beyond minimum requirements to support successful EBP implementation in regard to helping others and keeping informed. The ICBS is comprised of 6 items that are scored on a 5-point scale (0 = Not at all, 4 = Frequently, if not always). The ICBS has demonstrated excellent internal consistency reliability for the Helping (α = .93, 3 items), Keeping Informed (α = .91, 3 items), and ICBS Total EBP Citizenship scales (α = .93).

Assessment of Prior MI Use

The Assessment of Prior MI Use measure was developed for this study to measure the extent to which service providers were exposed to MI (i.e., previous training and use of MI with clients) before the MI training offered as part of the study. All participants will be asked about prior training and current use of MI; together with study data on the number of recordings and session uploads, these responses will aid in determining penetration at the provider and client levels. Those with prior experience will be asked to estimate the percentage of clients with whom they use MI and the extent to which they use MI with fidelity. These data can be used as provider-level covariates in quantitative analyses.

LOCI Component Assessment

The LOCI Component Assessment was developed to increase understanding of the relative importance and utility of each of the six components of LOCI. It measures the extent to which LOCI trainees perceive each component to be important and useful in achieving change in leadership and improving EBP implementation.

MI Coach Rating Scale

The MI Coach Rating Scale (MI-CRS) [77] will be used to assess fidelity. This scale was developed based on the MI Treatment Integrity instrument (MITI) [78], which has demonstrated inter-rater reliability and differentiates between MI and usual care [79]. The MI-CRS was developed using item response theory [80, 81] and is designed to be useful for research and practical for agencies to use for their own in-house fidelity monitoring. Five items address the relational components of MI and five items address the technical components, with each item scored on a 4-point scale (1 = Poor, 2 = Fair, 3 = Good, 4 = Excellent). A mean of 3.5 or above on a given component is considered solid competence, a mean of at least 2.5 but below 3.5 is considered beginner competence, and a mean below 2.5 is considered below competence. Component scores will be averaged to compute a total mean fidelity score. Research staff will use the MI-CRS to code audio recordings of client sessions in which MI is used.
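For illustration, the scoring rules above can be sketched in Python; the function names and coder ratings are hypothetical and not part of the MI-CRS itself:

```python
from statistics import mean

def component_score(items):
    """Mean of the five 1-4 ratings for one MI-CRS component."""
    assert len(items) == 5 and all(1 <= i <= 4 for i in items)
    return mean(items)

def competence_band(score):
    """Map a component mean onto the protocol's competence bands."""
    if score >= 3.5:
        return "solid competence"
    if score >= 2.5:
        return "beginner competence"
    return "below competence"

def total_fidelity(relational, technical):
    """Average the relational and technical component means."""
    return mean([component_score(relational), component_score(technical)])

relational = [4, 4, 3, 4, 3]   # hypothetical coder ratings
technical = [3, 2, 3, 3, 2]
```

With these ratings, the relational mean (3.6) falls in the solid band, the technical mean (2.6) in the beginner band, and the total fidelity score is their average.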

Stages of Implementation Completion

The stages of implementation completion (SIC) measure will be used to assess implementation process [82]. It assesses activities that occur during the measure’s eight stages, which fit within the EPIS implementation phases of Preparation (e.g., SIC pre-implementation) and Implementation. The SIC will be adapted for MI and LOCI, in consultation with the measure developer, to assess each workgroup’s progress toward successful MI implementation; it will be completed by the investigative team based on ongoing documentation of the dates on which each SIC activity and stage is completed. Three scores are calculated: (1) a duration score (i.e., the time in days a site spends in a stage), (2) a proportion score (i.e., the percentage of activities completed within a stage), and (3) the SIC stage score (i.e., the number of stages completed). The SIC has demonstrated construct and predictive validity and identifies and predicts variation in implementation process [83, 84].
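As an illustration of how the three SIC scores are derived, the following Python sketch computes them from a hypothetical activity log; the stage and activity names are invented for the example:

```python
from datetime import date

# Hypothetical activity log for one site: stage number -> list of
# (activity, completion date or None if not yet completed).
log = {
    1: [("engage agency", date(2017, 1, 10)),
        ("sign agreement", date(2017, 1, 24))],
    2: [("select staff", date(2017, 2, 1)),
        ("schedule MI training", None),
        ("baseline staff survey", date(2017, 2, 20))],
}

def duration_score(log, stage):
    """Days the site spends in a stage (first to last completed activity)."""
    dates = [d for _, d in log[stage] if d is not None]
    return (max(dates) - min(dates)).days if dates else None

def proportion_score(log, stage):
    """Percentage of the stage's activities the site has completed."""
    done = sum(1 for _, d in log[stage] if d is not None)
    return 100.0 * done / len(log[stage])

def stage_score(log):
    """Number of stages in which every activity has been completed."""
    return sum(all(d is not None for _, d in acts) for acts in log.values())
```

Here the site spent 14 days in stage 1, completed two-thirds of the stage 2 activities, and has fully completed one stage.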

Assessment of Climate Embedding Mechanisms (ACEM)

The ACEM was developed for this study and will be used to measure the organizational strategies developed and used by each organization and leader trainee. The first part of the measure includes qualitative items to identify all strategies that are being used within a workgroup, the larger organization, and across levels to embed a climate for the implementation of MI. For each strategy identified, quantitative items measure each strategy’s frequency of use and level of emphasis. The result is a list of strategies used within each workgroup and organization to support the implementation of MI.

LOCI Feasibility, Acceptability, Utility

The LOCI Feasibility, Acceptability, Utility scale [1] is an 11-item measure developed for the LOCI pilot study to assess the LOCI training.

Data analyses

The study will estimate the leadership intervention effect by comparing the LOCI and control conditions on provider-rated leadership, workplace climate, work attitudes, and attitudes toward adopting and implementing EBPs. The workgroup will be the unit of analysis. Sites will be randomized within agency so that the approximately 5% of workgroups that are co-located at the same site receive the same condition, avoiding contamination; for the remaining 95% of workgroups, workgroup is synonymous with site. Randomization will be stratified by the number of workgroups per site to balance workgroups across conditions.
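The site-level randomization constraint can be illustrated with a Python sketch; the roster, seed, and count-based pairing heuristic are hypothetical simplifications of the actual stratified procedure:

```python
import random

# Hypothetical roster: workgroup -> (agency, site). Workgroups sharing a
# site are co-located and must receive the same condition.
roster = {
    "wg1": ("A", "siteA1"), "wg2": ("A", "siteA1"),   # co-located pair
    "wg3": ("A", "siteA2"), "wg4": ("B", "siteB1"),
    "wg5": ("B", "siteB2"), "wg6": ("B", "siteB3"),
}

def randomize(roster, seed=42):
    """Randomize sites (not workgroups) to LOCI vs. control within agency,
    pairing sites by workgroup count as a crude form of stratification."""
    rng = random.Random(seed)
    agencies = {}
    for wg, (agency, site) in roster.items():
        agencies.setdefault(agency, {}).setdefault(site, []).append(wg)
    assignment = {}
    for sites in agencies.values():
        ordered = sorted(sites, key=lambda s: len(sites[s]))
        for i in range(0, len(ordered), 2):
            pair = ordered[i:i + 2]
            rng.shuffle(pair)
            # An unpaired leftover site defaults to LOCI in this sketch.
            for site, cond in zip(pair, ("LOCI", "control")):
                for wg in sites[site]:
                    assignment[wg] = cond
    return assignment
```

Because the coin is flipped per site rather than per workgroup, co-located workgroups (here, wg1 and wg2) always land in the same condition.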

Data analytic strategies will follow the recommendations of Brown et al. [85] and the Prevention Science and Methodology Group for randomized field trials [86]. Preliminary data screening and cleaning will include examination of data distributions for normality and of missing data patterns at both the univariate and multivariate levels. Hierarchical linear modeling (HLM) [87] will then be used as the primary statistical model because of the nested structure of the data: repeated measures (time; level 1) nested within service providers (level 2) nested within workgroups (level 3). Because randomization occurs at the workgroup level, and because of the statistical complications associated with the small number of agencies and with multiple workgroups being located at the same site, dummy-coded agency and site variables will be entered at the workgroup level for all analyses, which implicitly controls for agency and site effects [88]. Random effect variances will be evaluated for the remaining levels of the nested data structure; terms with near-zero variance estimates will be removed from the statistical model.
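The decision rule for dropping near-zero random effects can be illustrated with a simple moment-based variance decomposition; the data and the balanced-design estimator below are a hypothetical simplification of the full HLM:

```python
from statistics import mean, variance

# Hypothetical provider-level outcome scores grouped by workgroup
# (balanced design: equal providers per workgroup, for simplicity).
groups = {
    "wg1": [3.1, 3.3, 3.2, 3.4],
    "wg2": [2.1, 2.0, 2.3, 2.2],
    "wg3": [3.9, 4.0, 3.8, 4.1],
}

def variance_components(groups):
    """One-way ANOVA moment estimates of within- and between-group variance."""
    n = len(next(iter(groups.values())))           # providers per workgroup
    within = mean(variance(v) for v in groups.values())
    msb = variance([mean(v) for v in groups.values()]) * n
    between = max((msb - within) / n, 0.0)         # truncate negative estimates
    return within, between

within, between = variance_components(groups)
icc = between / (between + within)   # share of variance at the workgroup level
```

A near-zero `between` estimate (and hence a near-zero ICC) would indicate that the workgroup-level random effect adds nothing and can be dropped from the model.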

HLM analyses for aims 1 and 2 will test specific intervention effects (LOCI vs. webinar control condition) on target outcomes (e.g., leadership behaviors, implementation climate). Of primary interest in these analyses is the Time (level 1) × Intervention Status (level 3) cross-level interaction effect. If the cross-level interaction term(s) are statistically significant, follow-up examination of these effects will follow the procedures of Preacher, Curran, and Bauer [89]. Cross-classified models will be explored, given that some shifting of personnel may occur during the course of the study [90].
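Probing a cross-level interaction amounts to comparing the time slope within each condition (the simple-slopes logic of Preacher, Curran, and Bauer). A minimal sketch with invented workgroup-mean scores:

```python
from statistics import mean

def slope(x, y):
    """Ordinary least squares slope of y on x."""
    mx, my = mean(x), mean(y)
    return (sum((u - mx) * (v - my) for u, v in zip(x, y))
            / sum((u - mx) ** 2 for u in x))

# Hypothetical workgroup-mean climate scores at months 0, 4, 8, 12.
months = [0, 4, 8, 12]
loci = [2.0, 2.5, 3.0, 3.5]      # steady improvement under LOCI
control = [2.0, 2.1, 2.0, 2.2]   # roughly flat under the control condition

simple_loci = slope(months, loci)        # simple slope within LOCI
simple_control = slope(months, control)  # simple slope within control
interaction = simple_loci - simple_control
```

The interaction term corresponds to the difference between the two simple slopes; probing asks whether each within-condition slope differs from zero.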

Aim 3 will test intervention effects on fidelity. The statistical methods will be based on the HLM models described for the first two aims; however, because measurements for this outcome are averaged over time, no covariance structure will be modeled. For count outcomes related to implementation process (workgroup level), we will use a negative binomial regression model rather than a Poisson regression model to allow for possible over-dispersion.
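The rationale for preferring negative binomial regression can be shown with a quick dispersion check on hypothetical count data; under a Poisson model the variance-to-mean ratio should be near 1:

```python
from statistics import mean, variance

# Hypothetical workgroup-level counts (e.g., implementation activities
# completed in a fixed window) with a long right tail.
counts = [0, 1, 1, 2, 3, 3, 5, 8, 12, 15]

m = mean(counts)
v = variance(counts)        # sample variance
dispersion = v / m          # ~1 under Poisson; >1 signals over-dispersion

# Method-of-moments negative binomial size parameter:
# Var = mu + mu^2 / k  =>  k = mu^2 / (Var - mu)
k = m ** 2 / (v - m) if v > m else float("inf")
```

Here the variance is roughly five times the mean, so a Poisson model would understate the standard errors and the negative binomial is the safer choice.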

Finally, mediational effects described in aim 4 will be tested using procedures for multilevel data structures outlined in MacKinnon [91]. For all analyses, covariates will be considered at both the provider and workgroup levels. Inclusion of confounder variables at the provider level is important because randomization to the LOCI vs. control condition will not occur at that level. Inclusion of covariates at the workgroup level will adjust for chance imbalances and increase the precision of the analyses.
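The product-of-coefficients logic behind these mediation analyses can be sketched for a single mediator; the data below are invented, and the planned analyses will use multilevel extensions of this idea [91]:

```python
from statistics import mean

def cov(a, b):
    """Sample covariance of two equal-length lists."""
    ma, mb = mean(a), mean(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

def indirect_effect(x, m, y):
    """Product of coefficients: a (X -> M) times b (M -> Y, adjusting for X)."""
    a = cov(x, m) / cov(x, x)
    b = (cov(x, x) * cov(m, y) - cov(x, m) * cov(x, y)) / \
        (cov(x, x) * cov(m, m) - cov(x, m) ** 2)
    return a * b

# Hypothetical provider data: condition (x), implementation climate (m),
# and MI fidelity (y); climate carries most of the condition effect here.
x = [0, 0, 0, 0, 1, 1, 1, 1]
m = [1.0, 1.2, 0.9, 1.1, 2.0, 2.2, 1.9, 2.1]
y = [0.5, 0.6, 0.4, 0.5, 1.0, 1.1, 0.9, 1.0]
```

The indirect effect a × b quantifies how much of the condition's effect on fidelity is transmitted through implementation climate.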

Discussion

LOCI has been developed to be feasible and effective for organizations seeking to create a positive climate and fertile context for EBP implementation. The approach seeks to cultivate and sustain effective general and implementation leadership in conjunction with organizational strategies and supports that will remain after the implementation strategy has ended. LOCI’s strategic in-person training combined with brief weekly coaching minimizes burden, promotes ongoing cognitive processing (i.e., being mindful of leadership and implementation issues), and enables ongoing and repeated efforts at behavior change. Effective leadership in health and allied health services is associated with more positive staff attitudes toward adopting EBPs, provider adoption of EBPs, improved staff work attitudes, and better performance [51]. Effective leadership is also critical to the successful and sustainable implementation of innovation [31, 92]. LOCI builds on these findings to engage broader organizational structures and processes in supporting implementation and sustainment.

Most agencies already provide manager trainings, but these tend to be superficial and to lack the follow-up needed to support leader and organizational behavior change. Many off-the-shelf leadership development programs exist, but they may lack utility because (1) they tend to be broad in scope and are not designed to develop strategic climates for EBP, (2) they are often not based on empirically supported approaches and curricula, (3) they are often time-limited with little or no follow-up, contrary to learning theory suggesting that interventions distributed over time and coupled with coaching and practice are more effective [93], and (4) they are rarely tested for evidence of effectiveness or practical utility. The work proposed in this study addresses these limitations by (1) focusing on developing strategic climate for EBP in SUDT services, (2) utilizing an empirically based curriculum with measurable outcomes [94], (3) combining didactic and interactive training with ongoing coaching to support learning and behavior change, (4) linking first-level leader development with targeted organizational support strategies, and (5) empirically testing LOCI to determine its effects on proximal (leadership behaviors, implementation climate) and distal (provider attitudes and behaviors, fidelity, implementation process) outcomes.

While many SUDT providers are trained in MI, few deliver MI with fidelity. This study addresses a major gap in the way in which EBPs are “implemented” in usual care SUDT settings. It is not enough to “train and hope” that EBPs will be delivered with fidelity and in a manner that will lead to improved patient outcomes. What is needed are more comprehensive approaches to changing the context of community-based services to be ready, willing, and able to implement appropriate evidence-based service models to improve patient outcomes. The LOCI intervention holds promise in regard to these goals.

Declarations

Acknowledgements

This study is supported by the National Institute on Drug Abuse grant, R01DA038466, and developmental work was supported by the National Institute of Mental Health grants R21MH082731 (PI: Aarons), R21MH098124 (PI: Ehrhart), and R25MH080916 (PI: Proctor) and by the Center for Organizational Research on Implementation and Leadership (CORIL). The authors thank the community-based organizations, counselors, case managers, and supervisors that collaborate with us to make this work possible.

Funding

The funding for this project is provided by the US National Institute on Drug Abuse (NIDA) grant number R01DA038466. The opinions expressed herein are the views of the authors and do not necessarily reflect the official policy or position of the National Institute on Drug Abuse or any other part of the US Department of Health and Human Services.

Availability of data and materials

The datasets used and/or analyzed during the current study will be available from the corresponding author on reasonable request. MI training materials and process are the property of Behavioral Change Consultants.

Authors’ contributions

GAA, MGE, AEG, and EMT conceptualized this study, and GAA and MGE wrote the grant proposal. JCM and GAA drafted the manuscript. All other authors reviewed and edited the manuscript. All authors read and approved the final manuscript.

Competing interests

GAA is an associate editor of Implementation Science; all decisions on this paper were made by another editor. The authors declare that they have no other competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

This study was approved by the University of California, San Diego, Human Research Protections Program (HRPP; Protocol 141134). Informed consent will be obtained from all relevant participants.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1) Department of Psychiatry, University of California, San Diego
(2) Child and Adolescent Services Research Center
(3) Department of Psychology, San Diego State University

References

  1. Aarons GA, Ehrhart MG, Farahnak LR, Hurlburt MS. Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci. 2015;10:11.
  2. Schein E. Organizational culture and leadership. San Francisco: John Wiley and Sons; 2010.
  3. Dadich A. From bench to bedside: methods that help clinicians use evidence-based practice. Aust Psychol. 2010;45:197–211.
  4. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv. 2006;3:61–72.
  5. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78:537–50.
  6. Parmelli E, Flodgren G, Beyer F, Baillie N, Schaafsma ME, Eccles MP. The effectiveness of strategies to change organisational culture to improve healthcare performance: a systematic review. Implement Sci. 2011;6:33.
  7. Sobo EJ, Bowman C, Aarons GA, Asch S, Gifford AL. Enhancing organizational change and improvement prospects: lessons from an HIV testing intervention for veterans. Hum Organ. 2008;67:443–53.
  8. Weiner BJ, Belden CM, Bergmire DM, Johnston M. The meaning and measurement of implementation climate. Implement Sci. 2011;6:78–89.
  9. Green AE, Albanese BJ, Cafri G, Aarons GA. Leadership, organizational climate, and working alliance in a children’s mental health service system. Community Ment Health J. 2014;50:771–7.
  10. De Hoogh AHB, Den Hartog DN, Koopman PL, Thierry H, Van Den Berg PT, Van Der Weide JG, Wilderom CPM. Leader motives, charismatic leadership, and subordinates’ work attitude in the profit and voluntary sector. Leadersh Q. 2005;16:17–38.
  11. Glisson C, Durick M. Predictors of job satisfaction and organizational commitment in human service organizations. Adm Sci Q. 1988;33:61–81.
  12. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiatr Serv. 2006;57:1162–9.
  13. Aarons GA, Sommerfeld DH. Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. J Am Acad Child Adolesc Psychiatry. 2012;51:423–31.
  14. Corrigan PW, Lickey SE, Campion J, Rashid F. Mental health team leadership and consumers’ satisfaction and quality of life. Psychiatr Serv. 2000;51:781–5.
  15. Griffith TL, Zammuto RF, Aiman-Smith L. Why new technologies fail: overcoming the invisibility of implementation. Ind Manag. 1999;41:29–34.
  16. Klein KJ, Knight AP. Innovation implementation: overcoming the challenge. Curr Dir Psychol Sci. 2005;14:243–6.
  17. Rizzuto TE, Reeves J. A multidisciplinary meta-analysis of human barriers to technology implementation. Consult Psychol J: Pract Res. 2007;59:226–40.
  18. Demakis JG, McQueen L, Kizer KW, Feussner JR. Quality enhancement research initiative (QUERI): a collaboration between research and clinical practice. Med Care. 2000;38:17–25.
  19. Ford JH, Green CA, Hoffman KA, Wisdom JP, Riley KJ, Bergmann L, Molfenter T. Process improvement needs in substance abuse treatment: admissions walk-through results. J Subst Abuse Treat. 2007;33:379–89.
  20. McCarty D, Gustafson DH, Wisdom JP, Ford J, Choi D, Molfenter T, Capoccia V, Cotter F. The Network for the Improvement of Addiction Treatment (NIATx): enhancing access and retention. Drug Alcohol Depend. 2007;88:138–45.
  21. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
  22. Hoffman KA, Green CA, Ford II JH, Wisdom JP, Gustafson DH, McCarty D. Improving quality of care in substance abuse treatment using five key process improvement principles. J Behav Health Serv Res. 2012;39:234–44.
  23. Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9:45.
  24. Bass BM, Avolio BJ. The implications of transformational and transactional leadership for individual, team, and organizational development. In: Pasmore W, Woodman RW, editors. Research in organizational change and development. Greenwich: JAI Press; 1990. p. 231–72.
  25. Judge TA, Piccolo RF. Transformational and transactional leadership: a meta-analytic test of their relative validity. J Appl Psychol. 2004;89:755–68.
  26. Avolio BJ, Bass BM, Jung DI. Re-examining the components of transformational and transactional leadership using the Multifactor Leadership Questionnaire. J Occup Organ Psychol. 1999;72:441–62.
  27. Bass BM. Does the transactional-transformational leadership paradigm transcend organizational and national boundaries? Am Psychol. 1997;52:130–9.
  28. Bass BM, Avolio BJ, Jung DI, Berson Y. Predicting unit performance by assessing transformational and transactional leadership. J Appl Psychol. 2003;88:207–18.
  29. Aarons GA, Ehrhart MG, Torres EM, Finn NK, Roesch SC. Validation of the Implementation Leadership Scale (ILS) in substance use disorder treatment organizations. J Subst Abuse Treat. 2016;68:31–5.
  30. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9:157.
  31. Klein KJ, Conn AB, Sorra JS. Implementing computerized technology: an organizational analysis. J Appl Psychol. 2001;86:811–24.
  32. Schneider B. Organizational climate and culture. San Francisco: Jossey-Bass; 1990.
  33. Ehrhart MG, Schneider B, Macey WH. Organizational climate and culture: an introduction to theory, research, and practice. New York: Routledge; 2014.
  34. Klein KJ, Conn AB, Smith DB, Sorra JS. Is everyone in agreement? An exploration of within-group agreement in employee perceptions of the work environment. J Appl Psychol. 2001;86:3–16.
  35. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: service provider perspectives. Adm Policy Ment Health. 2007;34:411–9.
  36. O’Reilly CA, Caldwell DF, Chatman JA, Lapiz M, Self W. How leadership matters: the effects of leaders’ alignment on strategy implementation. Leadersh Q. 2010;21:104–13.
  37. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: the stages of implementation completion (SIC). Implement Sci. 2011;6:116–23.
  38. Saldana L. The stages of implementation completion for evidence-based practice: protocol for a mixed methods study. Implement Sci. 2014;9:43.
  39. Podsakoff PM, Todor WD. Relationships between leader reward and punishment behavior and group processes and productivity. J Manage. 1985;11:55–73.
  40. Huang L, Macbeth G, Dodge J, Jacobstein D. Transforming the workforce in children’s mental health. Adm Policy Ment Health. 2004;32:167–87.
  41. Rogers EM. Diffusion of innovations. 4th ed. New York: The Free Press; 1995.
  42. Priestland A, Hanig R. Developing first-level leaders. Harv Bus Rev. 2005;83:112–20.
  43. Ash J. Organizational factors that influence information technology diffusion in academic health sciences centers. J Am Med Inform Assoc. 1997;4:102–11.
  44. Aarons GA, Sommerfeld DH, Walrath-Greene CM. Evidence-based practice implementation: the impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implement Sci. 2009;4:83–96.
  45. Birleson P. Turning child and adolescent mental-health services into learning organizations. Clin Child Psychol Psychiatry. 1999;4:265–74.
  46. Eisenberger R, Fasolo P, Davis-LaMastro V. Perceived organizational support and employee diligence, commitment, and innovation. J Appl Psychol. 1990;75:51–9.
  47. Shortell SM. Developing individual leaders is not enough. J Health Serv Res Policy. 2002;7:193–4.
  48. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50–64.
  49. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q. 2001;79:281–315.
  50. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.
  51. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74.
  52. Miller WR. Motivational interviewing with problem drinkers. Behav Psychother. 1983;11:147–72.
  53. Miller WR, Rollnick S. Motivational interviewing: preparing people for change. 2nd ed. New York: Guilford Press; 2002.
  54. Henman MJ, Butow PN, Brown RF, Boyle F, Tattersall MHN. Lay constructions of decision-making in cancer. Psychooncology. 2002;11:295–306.
  55. Jahng KH, Martin LR, Golin CE, DiMatteo MR. Preferences for medical collaboration: patient-physician congruence and patient outcomes. Patient Educ Couns. 2005;57:308–14.
  56. Kaplan SH, Greenfield S, Ware Jr JE. Assessing the effects of physician-patient interactions on the outcomes of chronic disease. Med Care. 1989;27:110.
  57. Ong LM, De Haes JC, Hoos AM, Lammes FB. Doctor-patient communication: a review of the literature. Soc Sci Med. 1995;40:903–18.
  58. Oates J, Weston WW, Jordan J. The impact of patient-centered care on outcomes. J Fam Pract. 2000;49:796–804.
  59. Trummer UF, Mueller UO, Nowak P, Stidl T, Pelikan JM. Does physician-patient communication that aims at empowering patients improve clinical outcome? A case study. Patient Educ Couns. 2006;61:299–306.
  60. Street RL, Piziak VK, Carpenter WS, Herzog J, Hejl J, Skinner G, McLellan L. Provider-patient communication and metabolic control. Diabetes Care. 1993;16:714–21.
  61. Fields S. Characteristics associated with retention among African American and Latino adolescent HIV-positive men: results from the outreach, care, and prevention to engage HIV-seropositive young MSM of color special project of national significance initiative. J Acquir Immune Defic Syndr. 2010;53:529–36.
  62. Street Jr RL, Gordon HW, Haidet P. Physicians’ communication and perceptions of patients: is it how they look, how they talk, or is it just the doctor? Soc Sci Med. 2007;65:586–98.
  63. Baker A, Kochan NN, Dixon JJ, Heather NN, Wodak AA. Controlled evaluation of a brief intervention for HIV prevention among injecting drug users not in treatment. AIDS Care. 1994;6:559–70.
  64. Vasilaki E, Hosier SG, Cox WM. The efficacy of motivational interviewing as a brief intervention for excessive drinking: a meta-analytic review. Alcohol Alcohol. 2006;41:328–35.
  65. Aarons GA, Ehrhart MG, Horowitz JD. Leadership to facilitate evidence-based practice implementation in healthcare organizations. In: Academy of Management Annual Meeting; Montreal. 2010.
  66. Balas EA, Boren SA. Managing clinical knowledge for healthcare improvements. In: Bemmel J, McCray AT, editors. Yearbook of medical informatics 2000: patient-centered systems. Stuttgart: Schattauer Verlagsgesellschaft; 2000. p. 65–70.
  67. Tooley EM, Moyers TB. Motivational interviewing in practice. In: Walters ST, Rotgers F, editors. Treating substance abuse: theory and technique. 3rd ed. New York: Guilford Press; 2011. p. 28–47.
  68. Miller WR, Rollnick S. Ten things that motivational interviewing is not. Behav Cogn Psychother. 2009;37:129–40.
  69. Ehrhart MG, Aarons GA, Farahnak LR. Going above and beyond for implementation: the development and validity testing of the Implementation Citizenship Behavior Scale (ICBS). Implement Sci. 2015;10:65.
  70. Bass BM, Avolio BJ. MLQ: Multifactor Leadership Questionnaire. Redwood City: Mind Garden; 1995.
  71. Aarons GA, Sommerfeld DH, Willging CE. The soft underbelly of system change: the role of leadership and organizational climate in turnover during statewide behavioral health reform. Psychol Serv. 2011;8:269–81.
  72. Edmondson AC. Psychological safety and learning behavior in work teams. Adm Sci Q. 1999;44:350–83.
  73. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6:61–74.
  74. Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G. Psychometric properties and United States national norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22:356–65.
  75. Aarons GA, Glisson C, Green PD, Hoagwood K, Kelleher KJ, Landsverk JA, The Research Network on Youth Mental Health. The organizational social context of mental health services and clinician attitudes toward evidence-based practice: a United States national study. Implement Sci. 2012;7:56–70.
  76. Hofmann DA, Morgeson FP, Gerras SJ. Climate as a moderator of the relationship between leader-member exchange and content specific citizenship: safety climate as an exemplar. J Appl Psychol. 2003;88:170–8.
  77. Naar-King S, Suarez M, editors. Motivational interviewing with adolescents and young adults. New York: Guilford Press; 2011.
  78. Moyers TB, Rowell LN, Manuel JK, Ernst D, Houck JM. The motivational interviewing treatment integrity code (MITI 4): rationale, preliminary reliability and validity. J Subst Abuse Treat. 2016;65:36–42.
  79. D’Amico EJ, Osilla KC, Miles JN, Ewing B, Sullivan K, Katz K, Hunter SB. Assessing motivational interviewing integrity for group interventions with adolescents. Psychol Addict Behav. 2012;26:994–1000.
  80. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for educational and psychological testing. Washington: American Educational Research Association; 1999.
  81. Wilson M. Constructing measures: an item response modeling approach. Mahwah: Lawrence Erlbaum Associates, Inc; 2005.
  82. Chamberlain P, Snowden LR, Padgett C, Saldana L, Roles J, Holmes L, Landsverk J. A strategy for assessing costs of implementing new practices in the child welfare system: adapting the English cost calculator in the United States. Adm Policy Ment Health. 2011;38:24–31.
  83. Saldana L, Chamberlain P, Wang W, Brown CH. Predicting program start-up using the stages of implementation measure. Adm Policy Ment Health. 2012;39:419–25.
  84. Brown CH, Chamberlain P, Saldana L, Wang W, Cruden G. Evaluation of two implementation strategies in fifty-one counties in two states: results of a cluster randomized implementation trial. Implement Sci. 2014;9:134.
  85. Brown CH, Kellam SG, Kaupert S, Muthen B, Wang W, Muthen LK, Chamberlain P, PoVey CL, Cady R, Valente TW, et al. Partnerships for the design, conduct, and analysis of effectiveness, and implementation research: experiences of the prevention science and methodology group. Adm Policy Ment Health. 2012;39:301–16.
  86. Brown CH, Wang W, Kellam SG, Muthen BO, Petras H, Toyinbo P, Poduska J, Ialongo N, Wyman PA, Chamberlain P, et al. Methods for testing theory and evaluating impact in randomized field trials: intent-to-treat analyses for integrating the perspectives of person, place, and time. Drug Alcohol Depend. 2008;95:74–104.
  87. Raudenbush SW, Bryk AS. Hierarchical linear models: applications and data analysis methods. Thousand Oaks: Sage Publications; 2002.
  88. Allison PD. Fixed effects regression models. Thousand Oaks: Sage Publications; 2009.
  89. Preacher KJ, Curran PJ, Bauer DJ. Computational tools for probing interactions in multiple linear regression, multilevel modeling, and latent curve analysis. J Educ Behav Stat. 2006;31:437–48.
  90. Beretvas SN. Cross-classified and multiple-membership models. In: Hox JJ, Roberts JK, editors. Handbook for advanced multilevel analysis. New York: Routledge/Taylor; 2011. p. 313–34.
  91. MacKinnon DP. Introduction to statistical mediation analysis. New York: Erlbaum; 2008.
  92. Edmondson AC. Framing for learning: lessons in successful technology implementation. Calif Manage Rev. 2003;45:35–54.
  93. Goldstein IL, Ford JK. Training in organizations. 4th ed. Belmont: Wadsworth Thomson Learning; 2002.
  94. Howell JM, Hall-Merenda KE. The ties that bind: the impact of leader-member exchange, transformational and transactional leadership, and distance on predicting follower performance. J Appl Psychol. 1999;84:680–94.

Copyright

© The Author(s). 2017
