Open Access

The impact of inter-organizational alignment (IOA) on implementation outcomes: evaluating unique and shared organizational influences in education sector mental health

  • Aaron R. Lyon1,
  • Kelly Whitaker1,
  • Jill Locke1,
  • Clayton R. Cook2,
  • Kevin M. King1,
  • Mylien Duong1,
  • Chayna Davis1,
  • Mark D. Weist3,
  • Mark G. Ehrhart4 and
  • Gregory A. Aarons5, 6
Implementation Science 2018, 13:24

https://doi.org/10.1186/s13012-018-0721-1

Received: 19 January 2018

Accepted: 30 January 2018

Published: 7 February 2018

Abstract

Background

Integrated healthcare delivered by work groups in nontraditional service settings is increasingly common, yet contemporary implementation frameworks typically assume a single organization—or organizational unit—within which system-level processes influence service quality and implementation success. Recent implementation frameworks predict that inter-organizational alignment (i.e., similarity in values, characteristics, and activities related to implementation across organizations) may facilitate the implementation of evidence-based practices (EBP), but few studies have evaluated this premise. This study examines the impact of overlapping organizational contexts by evaluating the implementation contexts of externally employed mental health clinicians working in schools—the most common integrated service delivery setting for children and adolescents. Aim 1 is to estimate the effects of unique intra-organizational implementation contexts and combined inter-organizational alignment on implementation outcomes. Aim 2 is to examine the underlying mechanisms through which inter-organizational alignment facilitates or hinders EBP implementation.

Methods/design

This study will conduct sequential, exploratory mixed-methods research to evaluate the intra- and inter-organizational implementation contexts of schools and the external community-based organizations that most often employ school-based mental health clinicians, as they relate to mental health EBP implementation. Aim 1 will involve quantitative surveys with school-based, externally employed mental health clinicians, their supervisors, and proximal school-employed staff (total n = 120 participants) to estimate the effects of each organization’s general and implementation-specific organizational factors (e.g., climate, leadership) on implementation outcomes (fidelity, acceptability, appropriateness) and assess the moderating role of the degree of clinician embeddedness in the school setting. Aim 2 will explore the mechanisms through which inter-organizational alignment influences implementation outcomes by presenting the results of Aim 1 surveys to school-based clinicians (n = 30) and conducting semi-structured qualitative interviews. Qualitative data will be evaluated using an integrative inductive and deductive approach.

Discussion

The study aims are expected to identify intra- and inter-organizational constructs that are most instrumental to EBP implementation success in school-based integrated care settings and to illuminate mechanisms that may account for the influence of inter-organizational alignment. In addition to improving school-based mental health, these findings will spur future implementation research that considers relationships across organizations and optimize the capacity of implementation science to guide practice in increasingly complex systems of care.

Keywords

Schools, Implementation, Mental health, Inner organizational context, Organizational climate, Inter-organizational alignment, Integrated care

Background

Integrated care—defined as care by a team of health professionals, often in non-traditional settings (e.g., primary care, schools) [1]—is becoming increasingly common in adult and youth health and mental health services [2]. In the USA, this growth has been due, in part, to the considerable financial investment of the Affordable Care Act [3]. Similar legislation and policy directives have promoted integrated services internationally [4]. Integrated services have the potential to increase service accessibility and efficiency [5], particularly for the most vulnerable children and adolescents who are disproportionately impacted by fragmented mental health services [6–9]. Nevertheless, the services provided in most integrated care settings could be substantially improved, as they seldom include the delivery of evidence-based practices (EBP) and have failed to consistently show advantages over more traditional services [10–12].

Organizational factors are critical to successful implementation of EBP [13–15], but their unique manifestation in integrated care settings remains unexplored. Research has documented the impact of both general (molar) and implementation-specific organizational factors. Important molar organizational factors include organizational culture (i.e., shared values, beliefs, and implicit norms that guide behavior) and organizational climate (i.e., employee-shared perceptions of the work environment) [15–17]. Implementation-specific organizational factors include implementation climate (i.e., staff’s shared perceptions of the extent to which EBP implementation is expected, supported, and rewarded by their organization), implementation leadership (i.e., the attributes and behaviors of leaders that support effective implementation), and implementation citizenship behavior (i.e., going beyond the “call of duty” to support implementation) [15, 18–20]. However, most studies are limited to a single organization, and studies of multiple organizations do not examine alignment across organizations. The focus on single organizations does not align with the fundamental nature of integrated care, which often brings together previously disparate service systems with the goal of improving the accessibility and effectiveness of health services [21]. As such, research evaluating the ways integrated care organizations contribute individually and collectively to successful EBP implementation is both overdue and critical to comprehensive understanding of system-level influences.

Key organizational constructs

The Exploration, Preparation, Implementation, Sustainment (EPIS) framework [2] guides our conceptualization of relevant organizational characteristics and inter-organizational relationships. EPIS identifies critical factors in the inner (i.e., immediate organizational or unit setting where implementation of the innovation takes place) and outer (i.e., more distal settings beyond the immediate setting such as school district, state, federal, etc.) contexts that affect implementation across multiple phases. Inner context factors may be particularly critical to successful implementation and sustainment of EBP [2, 17, 22, 23]. Therefore, this study focuses on inner organizational factors such as general/molar culture and climate and implementation-specific organizational factors (i.e., implementation climate [24], implementation leadership [25], and implementation citizenship behavior). Although the EPIS framework also recognizes the importance of the inter-organizational context (connections among organizations or units of the outer and inner settings), few details are specified, reflecting the significant knowledge gap regarding the effects of overlapping organizational contexts on the uptake and delivery of EBP.

A clear understanding of inter-organizational factors that facilitate or hinder implementation is particularly critical in integrated care settings that involve providers who are affiliated with different yet overlapping organizations. EPIS suggests that implementation will be enhanced when overlapping organizations or organizational units exhibit similar values, characteristics, and activities that support implementation, but this prediction has been largely untested. For decades, empirical and practical attention was focused on coordination across service sectors, with variable and sometimes negative results [26–28]. Although some studies showed that such coordination improved service access and outcomes [29–31], others found that it negatively impacted quality of care [32]. Qualitative studies support EPIS predictions and suggest that, for organizations to work together successfully, implementation leaders and stakeholders should demonstrate similar core values, shared vision/priorities, and a collective commitment to make a difference [33, 34].

We conceptualize implementation-focused inter-organizational alignment (IOA) as the extent to which these commonalities manifest across organizations in key implementation constructs and across system or organizational levels. For example, regarding implementation climate, alignment in integrated care settings may manifest as shared expectations between organizations surrounding the delivery of high-quality services. Intra-organizational consistency in the messages communicated to employees results in decreased employee confusion and facilitates employee internalization of organizational objectives [35–39]. It follows that inter-organizational alignment in implementation climate would have similar effects for employees who work in integrated care settings. Importantly, although a number of qualitative studies have supported the importance of alignment in implementation climate for EBP adoption [34, 40, 41], its influence—and the relative influence of different organizational factors—has yet to be examined quantitatively.

As a second example, strong organizational leadership frequently emerges in qualitative studies as a critical component of successful inter-organizational collaborations [40–42]. When the leaders of organizations that collaborate to provide integrated care both support and engage in behaviors that facilitate implementation, this is likely to create a synergistic climate of openness, support, and accountability. However, to our knowledge, there are no quantitative studies regarding the combined impact of implementation leadership from multiple organizations. This study will explore the impact of alignment of these and other organizational constructs on implementation.

Inter-organizational alignment in school-based mental health

Schools provide convenient access to services and reduce barriers to treatment typically seen in traditional outpatient settings. This is particularly true for vulnerable ethnic and economic minority groups [43–46]. In part for this reason, school-based mental health (SBMH) programs have progressively grown in the USA and abroad [47–49], to the point where studies consistently indicate that 50–80% of all youth mental health services are provided in schools [50–52]. Despite the practicality and pervasiveness of SBMH, implementation of EBP in schools is highly variable, dramatically reducing the public health impact of these services [53–55]. Limited research has evaluated intra-organizational influences on the implementation of EBP in SBMH [56]. Furthermore, as with most other integrated care settings, very little work has evaluated the influence of inter-organizational factors on the delivery of evidence-based SBMH services.

Most frequently, SBMH services are provided by clinicians who are trained and employed by community-based organizations (CBOs) that exist outside of the education system [57]. This can lead to substantial administrative and contextual differences between schools and mental health systems (e.g., training, funding, etc.) that may impact the provision of services [58]. For example, comparing mental health staff employed by schools (e.g., school psychologists, counselors) to staff from CBOs working in schools (e.g., clinical social workers, psychologists) reveals significant differences in areas such as the ability to obtain consent for services, the provision of effective interventions, team functioning, and confidentiality standards [59]. This dual administrative relationship makes SBMH particularly conducive to studying the impact of IOA and generalizable to other integrated service settings [60]. Figure 1 displays our conceptualization of the inter-organizational context for SBMH. A series of studies has provided initial support for the influences of both the school and CBO organizational contexts. For instance, CBO factors significantly impact participation in post-training consultation and, ultimately, implementation success, even when accounting for clinician-level variables (e.g., EBP knowledge, attitudes) [61]. There also is evidence to suggest that the organizational context of schools predicts the use of EBP in public schools, with subsequent qualitative findings suggesting that implementation outcomes may be a function of both the CBO and school organizational contexts [62]. As a whole, the existing research indicates that each context plays an important role in successful implementation in schools, but it is unknown whether overlapping contexts have an interactive effect on implementation outcomes [11, 63].
Fig. 1

Inter-organizational context and inter-organizational alignment in school mental health

Clinician school embeddedness

The relative influence of school and CBO contexts also is likely to be affected by the extent to which SBMH clinicians are strongly connected to and collaborate with school personnel, which we term organizational embeddedness. Some clinicians are intimately involved in school functions, while others see students in isolation from other aspects of the school [64, 65]. Further, social relationships can motivate colleagues to support new initiatives and demonstrate commitment even when implementation is difficult [40, 66]. Well-embedded clinicians are more likely to be connected with opinion leaders or other implementation facilitators. Because it reflects a more extensive social connection to the school and school-based personnel, high clinician embeddedness may strengthen the impact of the school context on implementation outcomes. This embeddedness is influenced by school-level factors, such as principal support for mental health services, and district-level factors, such as memoranda of agreement between schools and community agencies that sanction CBO clinician involvement in teams and other functions [67]. To date, no research has examined the moderating role of embeddedness in an inter-organizational setting.

Mechanisms of influence

Improving the efficiency, speed, and sustainment of EBP implementation requires research on the mechanisms through which multiple organizations impact implementation outcomes, yet no empirical evidence on these mechanisms exists to date [2, 14, 68, 69]. However, candidate mechanisms for IOA at the organizational and individual levels can be identified to facilitate research, tailor implementations to context, or identify core components to increase the precision of implementation strategies [70]. When considering the partnership between two organizations, mechanisms may include group-level processes drawn from established organizational frameworks, such as inter-organizational fragmentation or cohesion [71, 72]. At the individual level, the recently articulated Research Domain Criteria (RDoC) [73, 74] may provide a template for identifying potential mechanisms that explain how inter-organizational processes influence the implementation behavior of individual employees. Although the RDoC were developed primarily to elucidate the processes through which psychopathology is developed or maintained, they provide a more general way of organizing factors that may explain the impact of environmental influences on individual outcomes. These include variables such as the impact of positive valence systems (e.g., reward learning, habit formation) that increase the use of new skills in response to consistent organizational supports (e.g., supportive leadership), negative valence systems (e.g., confusion and dread) that may result from misaligned culture or climate, or social processes (e.g., perception and understanding of others) that explain how consistent organizational norms or employee behaviors influence clinician engagement with EBP in well-aligned settings. The second aim of this study (below) will fill a substantial knowledge gap by exploring the underlying mechanisms through which inter-organizational factors facilitate or hinder implementation.

Objectives and aims

The long-term objective of this research is to facilitate a more detailed understanding of how inter-organizational linkages operate to facilitate—or impede—implementation of EBP in integrated settings. To achieve this objective, this study has two aims.

Aim 1

Conduct quantitative surveys with school-based, externally employed mental health clinicians, their supervisors, and proximal school staff (i.e., those whose roles give them familiarity with school-based mental health services) to estimate the effects of intra-organizational variables and inter-organizational alignment on implementation outcomes (fidelity, acceptability, appropriateness) and assess the moderating role of clinician embeddedness (i.e., degree to which the person is visible and interacts with others in the setting). Aim 1 will yield quantitative information about inter-organizational alignment to be further explored in Aim 2.

Aim 2

Explore the mechanisms through which inter-organizational alignment influences implementation outcomes by presenting the quantitative findings from Aim 1 to school-based clinicians and eliciting qualitative feedback regarding hypothesized mechanisms that may drive quantitative associations between organizational factors and implementation outcomes. Aim 2 will yield novel information about the pathways through which organizational processes in integrated settings influence the implementation outcomes assessed, contributing to the nascent literature on implementation mechanisms more generally [75] and advancing future research endeavors to tailor implementation strategies to a given context.

Methods/design

An observational design will be used to examine intra- and inter-organizational factors that influence successful implementation of EBP in SBMH, followed by a mixed-methods approach to elucidate mechanisms of influence. This will include (1) large-scale administration of a web-based survey to school and CBO personnel to estimate the effects of school and CBO organizational contexts, as well as their alignment, and (2) a mixed-methods approach [33, 76] in which qualitative data are collected to explain the quantitative results and identify the mechanisms linking inter-organizational characteristics and alignment to implementation outcomes.

Participants

Participants in both aims will be SBMH providers and proximally related professionals employed by CBOs and elementary, middle, and high schools located in two large urban school districts in the Midwest and Pacific Northwest. Proximal personnel within CBOs include SBMH clinicians and supervisors, whereas those in schools may include professionals such as school nurses, counselors, social workers, or administrators. Sites were selected for having longstanding SBMH programs (which are expected to yield a range of clinician embeddedness), administrative relationships with the schools reflecting the most common arrangements nationally (i.e., CBOs providing school-based services via a county contract), and substantial ethnic and economic diversity of the district student bodies (60%+ nonwhite; 45–65% low income). Based on our team’s prior research and the demographic makeup of the participating organizations, it is expected that school and CBO personnel will be approximately 80–85% female and 75% Caucasian, 10% African American, 8% Asian, and 10% Latino/a. The total sample will be 120 participants, including 30–35 clinicians and 90 school personnel.

Aim 1 design: quantitative evaluation of intra- and inter-organizational contexts

Procedures

To address Aim 1, measures will be collected via web-based survey. To ensure high response rates, backup data collection procedures include in-person surveys, mailed surveys, and telephone follow-up. All CBO-employed SBMH clinicians will be recruited from each site. To identify school staff with roles proximal to social, emotional, and behavioral programming in schools, survey administration will be sequential over two phases, with school-based clinicians completing the first phase of data collection. During their survey completion, clinicians will identify staff from their schools whose roles include supporting student social, emotional, and behavioral health (e.g., school nurse, school psychologist, school counselor, teacher lead for student behavioral supports). Investigators will then contact each individual by email and/or telephone to introduce the study and answer any questions. Consent will be obtained as part of the web survey.

Measures

Consistent with prior organizational research [77], we will obtain at least three employee ratings for each of the three general/molar and three implementation-specific organizational-level constructs from personnel most proximal to the delivery of school-based mental health services. Because they are embedded in both contexts, SBMH clinicians will complete measures for both CBOs and schools. To avoid sequencing biases, clinicians will be randomized as to whether they first complete questions about their CBO or their school. Scores will be aggregated across reporters to yield scores for each organization on organizational measures. As detailed below, the organizational implementation context of CBOs will be measured using well-established measures of organizational social context and implementation-specific organizational factors. The school organizational context will be evaluated using versions of those tools that have previously been adapted for use in schools [56, 78]. Individual-level covariates (i.e., attitudes, citizenship) will also be collected using school-adapted instruments. Implementation outcomes will be gathered from SBMH clinicians and supervisors.

Demographics

A demographic survey for all participants (school and CBO) will include age, gender identity, race/ethnicity, education, years of training/terminal degree completion, primary discipline, experience, and work setting/activities performed.

Organizational social context (OSC)

The OSC is a measure of general/molar organizational culture and climate [15]. In the current study, only three subscales (Proficiency, Stress, and Functionality) will be administered; subscale α = 0.78–0.94.

Implementation Leadership Scale (ILS)

The ILS, a measure of implementation leadership [25, 79], has 12 items loading onto four subscales: Proactive Leadership, Knowledgeable Leadership, Supportive Leadership, and Perseverant Leadership; subscale α = 0.95–0.98.

Implementation Climate Scale (ICS)

The ICS, a measure of implementation climate [24, 80], includes 18 items loading onto a total score and six subscales: Focus on EBP, Educational Support for EBP, Recognition for EBP, Rewards for EBP, Selection for EBP, and Selection for Openness; subscale α = 0.81–0.91.

Implementation Citizenship Behavior Scale (ICBS)

The ICBS is a measure of an individual’s implementation citizenship behavior (i.e., the extent to which employees go “above and beyond the call of duty” for EBP implementation) [81] and consists of six items loading onto two subscales: Helping Others and Keeping Informed; subscale α = .94–.95 [56].

Evidence-Based Practice Attitude Scale (EBPAS)

Clinician attitudes toward evidence-based practice will be measured using a version of the EBPAS [82] that was adapted for use with education sector service providers and consists of 16 items that load onto four subscales: Requirements, Appeal, Openness, and Divergence; subscale α = .59–.90.

Expanded school mental health collaboration instrument (ESCI)

The ESCI [64] will be used to evaluate both clinician embeddedness and mental health service quality. The Outreach and Approach by Mental Health Professionals subscale includes 13 items and assesses the level of embeddedness of community mental health clinicians in schools. Improvements in school mental health delivery will be assessed with three additional subscales from the ESCI: Support for Teachers and Students (how students and teachers are supported through SBMH programming, 8 items); Increased Mental Health Programming (improvements in services at school, 5 items); and Improved Access for Students and Families (3 items); subscale α = .84–.94.

Fidelity/integrity

Consistent with prior organizational research [81], implementation integrity will be evaluated via a modified set of four items completed independently by clinicians and supervisors. Integrity items address practices used with clients, competence, and adherence. Internal consistency for these ratings has been found to be high (α = .97) [81]. As a final component of this measure, respondents will also provide information about the specific EBP that they have implemented in their work setting.

Acceptability and appropriateness

Acceptability and appropriateness of new EBP will be assessed using the four-item Acceptability of Intervention Measure (AIM) and Intervention Appropriateness Measure (IAM) [83], tailored to ask generally about evidence-based practices. The measures have been found to have good inter-item reliability (0.85 for acceptability, 0.91 for appropriateness) and test-retest reliability (0.83 for acceptability, 0.87 for appropriateness).

Analyses

Basic data screening procedures will be conducted to identify errors and explore normality, linearity, form, and outliers. Data will be transformed as appropriate. Statistical analyses will be conducted using commonly available software packages (SPSS [84], Stata [85], Multilevel/Hierarchical Linear Modeling [MLM/HLM] [86], and R [87]).

Inter-organizational alignment

IOA is the degree to which schools and CBOs exhibit similarity on organizational variables linked to implementation (e.g., general/molar organizational culture; implementation leadership). To describe variability in alignment, we will first use the intra-class correlation coefficient (ICC), which estimates the variability in each variable that is common across schools and CBOs. Prior research has suggested that difference scores are not ideal as a measure of agreement because they provide no information about whether the effect of a difference depends on the level of either variable [88, 89]. Thus, IOA will be analyzed using polynomial regression with response surface analysis [90, 91], described below.
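As an illustration of the ICC step described above, the sketch below computes a one-way random-effects ICC(1) across school-CBO dyads using simulated ratings; all values, sample sizes, and effect magnitudes are hypothetical, not study data:

```python
import numpy as np

# Hypothetical data: 25 school-CBO dyads, each contributing an aggregated
# school rating and an aggregated CBO rating on one organizational
# variable (e.g., implementation climate).
rng = np.random.default_rng(1)
shared = rng.normal(3.5, 0.4, 25)          # dyad-level (shared) component
ratings = np.column_stack([
    shared + rng.normal(0, 0.3, 25),       # school reporters
    shared + rng.normal(0, 0.3, 25),       # CBO reporters
])

# One-way random-effects ICC(1): proportion of total variance that is
# shared within dyads (i.e., common across the school and CBO ratings).
n, k = ratings.shape
row_means = ratings.mean(axis=1)
msb = k * np.sum((row_means - ratings.mean()) ** 2) / (n - 1)      # between-dyad mean square
msw = np.sum((ratings - row_means[:, None]) ** 2) / (n * (k - 1))  # within-dyad mean square
icc1 = (msb - msw) / (msb + (k - 1) * msw)
```

An ICC near 1 would indicate that school and CBO ratings of the variable are highly aligned; an ICC near 0 would indicate little shared variance across the two organizational contexts.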

Modeling approach

A multilevel modeling (MLM) framework will allow us to account for the fact that approximately 35 SBMH clinicians in approximately 25–30 schools will be nested within 10–12 CBOs. Because this design will capture nearly all of the school-serving CBOs in two regions, analyses will provide near-population-level estimates of effects. Thus, analyses will focus on obtaining effect size estimates rather than on traditional hypothesis testing; these estimates will be used to inform the qualitative aims. We will use the estimated effects from these models to guide the development and design of qualitative interview guides addressing the organizational factors found to most strongly predict implementation outcomes. We will first estimate the effects of each measure of the organizational implementation context (e.g., general/molar organizational climate, implementation climate) on each implementation outcome, using CBO and school organizational variables as unique predictors of clinician-level implementation outcomes. We also will include individual variables (e.g., EBP attitudes) in the multilevel models as level 1 (i.e., clinician-level) covariates, testing whether their inclusion improves model fit via likelihood-ratio testing. Next, we will examine the effects of IOA in two ways. First, for each variable, we will model an interaction between the CBO and school contexts; the size of the interaction provides an estimate of the interdependent effects of the school and CBO organizational variables. We also will examine the effects of alignment using the polynomial regression with response surface analysis approach [92], which examines nonlinearity of alignment effects (e.g., effects of alignment when organizational variables are rated similarly in both organizations) and allows a visual examination of the effects of alignment.
Finally, we will estimate the effects of our organizational variables on embeddedness for each organization (school or community-based organization) independently, estimate the effects of inter-organizational alignment on embeddedness, and examine the moderating effects of embeddedness on the separate influences of the school and CBO contexts. Overall, this analytic approach will enable us to obtain effect size estimates of the inter-organizational factors that impact clinician implementation outcomes, which will be used as the foundation for the qualitative phase of the project.
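The polynomial regression with response surface analysis approach can be sketched as follows. This is a minimal illustration on simulated data (the variable names, effect sizes, and misalignment penalty are hypothetical, not study estimates): the outcome is regressed on the centered school rating (X), the centered CBO rating (Y), and their second-order terms, and the fitted surface is then summarized along the line of congruence (X = Y) and the line of incongruence (X = -Y):

```python
import numpy as np

# Hypothetical data for 35 clinicians: school and CBO ratings of one
# organizational variable, and a fidelity outcome that is penalized
# when the two contexts are misaligned.
rng = np.random.default_rng(0)
school = rng.normal(3.5, 0.5, 35)
cbo = rng.normal(3.5, 0.5, 35)
fidelity = (0.4 * school + 0.4 * cbo
            - 0.3 * (school - cbo) ** 2      # misalignment penalty
            + rng.normal(0, 0.2, 35))        # residual noise

# Center predictors, then fit an intercept plus the five polynomial
# terms (X, Y, X^2, X*Y, Y^2) by ordinary least squares.
X = school - school.mean()
Y = cbo - cbo.mean()
design = np.column_stack([np.ones(35), X, Y, X**2, X * Y, Y**2])
b0, b1, b2, b3, b4, b5 = np.linalg.lstsq(design, fidelity, rcond=None)[0]

# Response surface summaries:
a1, a2 = b1 + b2, b3 + b4 + b5   # slope/curvature along congruence (X = Y)
a3, a4 = b1 - b2, b3 - b4 + b5   # slope/curvature along incongruence (X = -Y)
# A negative a4 indicates that outcomes worsen as the two contexts diverge.
```

In practice, the surface defined by these coefficients is plotted over the X-Y plane, allowing the visual examination of alignment effects noted above.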

Power

Our study is largely focused on estimating effects for an entire population rather than computing statistical significance or generalizing to a larger population and is therefore potentially less susceptible to issues associated with calculating effect sizes in small samples [93, 94]. Nevertheless, we calculated power to detect effects in MLM by computing a “corrected sample size,” which adjusts the sample size for the correlation due to nesting [68, 95]. Assuming an ICC of .10 [96], our effective sample size to detect effects on clinician-level implementation outcomes with 35 clinicians would be n = 29. Thus, the proposed study has sensitivity at power (1 − β) = .80 (α = .05) to detect effects as small as f2 = .37.
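The corrected sample size follows the standard design-effect formula, n_eff = n / (1 + (m - 1) × ICC), where m is the average cluster size. A quick check (assuming, for illustration, that clinicians cluster within the 12 CBOs at the upper end of the expected range) reproduces the effective n reported above:

```python
# Effective sample size under clustering (design-effect correction).
n_clinicians = 35
n_cbos = 12                  # assumed: upper end of the 10-12 expected CBOs
icc = 0.10                   # assumed intra-class correlation
m = n_clinicians / n_cbos    # average clinicians per CBO
deff = 1 + (m - 1) * icc     # design effect
n_eff = n_clinicians / deff
print(round(n_eff))          # prints 29
```
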

Aim 2: identification of preliminary implementation mechanisms

Procedures

Qualitative and mixed-methods inquiry is critical to mechanistic research [97]. To address Aim 2, visual summaries of Aim 1 quantitative, organizational data will be presented to all SBMH clinician participants to evaluate the ways in which alignment/misalignment of the CBO and school contexts influences their implementation of new practices. Because the objective is to describe the mechanisms through which IOA exerts its influence on SBMH clinicians, the perspectives of participants who directly experience both organizations (i.e., SBMH clinicians) are most critical. Individual semi-structured phone interviews, lasting approximately 45–60 min, will be conducted at a convenient time for clinicians and audio-recorded. Recordings will be transcribed prior to coding. The mixed methods design will be sequential in structure (quantitative data in Aim 1 collected prior to qualitative data in Aim 2); the functions are explanation and expansion (we will use qualitative data to provide depth and breadth of understanding of the mechanisms underlying the influence of IOA on implementation, i.e., QUAN➔QUAL); and the process is connecting (the qualitative dataset will build on the quantitative dataset) [35, 76, 98, 99]. Qualitative data in Aim 2 will explore the nuances of inter-organizational processes to understand mechanisms that affect implementation.

Qualitative interview protocol

We will develop a systematic, comprehensive semi-structured interview guide that examines underlying mechanisms through which inter-organizational processes facilitate or hinder EBP implementation. Feedback reports will be presented to participants for their respective school-CBO dyad and will include all of the organizational constructs measured in Aim 1 (i.e., OSC, ILS, ICS). Using these constructs as an overarching framework, we will generate questions that explore the inter-organizational implementation context as perceived by CBO clinicians and supervisors. Specifically, a series of questions will (1) request examples of how alignment/misalignment on each construct manifests in their work settings and (2) explore the mechanisms/processes through which degree of alignment/misalignment affects their ability to implement EBP. Questions will be carefully constructed to elicit clear information without assigning valence to performance. We also will ask questions about clinician embeddedness to gain a more nuanced understanding of the implications of clinician connections to and collaborations with the schools in which they work. Synthesis of this information will point to potential mechanisms through which IOA impacts successful implementation.

Analyses

All interviews will be transcribed and imported into the software package NVivo [100]. Data will be coded using an integrated deductive and inductive approach: certain codes will be conceptualized during development of the interview guide (i.e., deductive approach), and other codes will be developed through a close reading of an initial subset of transcripts (i.e., inductive approach) [101]. These themes will provide a way of identifying and understanding potential mechanisms through which IOA impacts implementation [102, 103]. Mechanisms can be both facilitative (i.e., those that result in better implementation) and inhibitory (i.e., those that impede implementation) and will be compared to a variety of individual and organizational mechanistic frameworks [71, 72, 74]. After a stable set of codes is developed, a consensus process will be used in which all reviewers independently code all of the transcripts and then meet to compare their coding and arrive at consensus judgments through open dialog [104–106]. Consensus coding is designed to capture data complexity, avoid errors, reduce groupthink, and circumvent some researcher biases. Mechanisms identified via qualitative analyses will be summarized separately for each organizational construct to determine the specific factors that facilitate or inhibit implementation based on alignment relationships.
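
As an illustrative aside (not part of the protocol above), independent coding prior to consensus meetings can be accompanied by a simple quantitative agreement check such as Cohen's kappa. The sketch below uses hypothetical code labels (e.g., "facilitative"/"inhibitory") to show how two coders' pre-consensus agreement might be quantified:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders
    who each assigned one code to the same set of transcript excerpts."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed proportion of excerpts where the coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Agreement expected by chance from each coder's marginal code frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical labels for four excerpts, before the consensus meeting
a = ["facilitative", "inhibitory", "facilitative", "facilitative"]
b = ["facilitative", "inhibitory", "inhibitory", "facilitative"]
print(round(cohens_kappa(a, b), 2))  # → 0.5
```

A low kappa before a consensus meeting flags codes whose definitions need refinement in the codebook, which complements (rather than replaces) the open-dialog consensus judgments described above.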

Sample size/power

Based on our previous work [107], it is anticipated that nearly all clinician participants from Aim 1 also will participate in qualitative interviews, resulting in a sample size of approximately n = 30. Based on our previous research focused on organizational issues in schools [66, 108], we anticipate that this total will be adequate to achieve data saturation and capture the constructs of interest [109, 110].

Discussion

Innovation

This study will address significant gaps in implementation science knowledge and is innovative by simultaneously examining the organizational contexts of multiple settings (schools and CBOs providing mental health services in schools), linking inter-organizational alignment to implementation, assessing the role of clinician embeddedness, and evaluating the mechanisms through which these processes impact clinician implementation outcomes. Previous implementation research in SBMH has been almost entirely qualitative and has focused on how the organizational factors of either the school [111, 112] or the CBO [113, 114] individually affect implementation. Notwithstanding the contributions of this research, conducting research in silos does not capture the dynamic and interactive nature of organizations and the influence they have on the delivery of high-quality integrated care. No research has simultaneously examined the unique organizational contexts of both schools and CBOs involved in the implementation of mental health EBP. This neglects the dual administrative relationship that typifies SBMH, where services are most commonly provided by clinicians who are trained and employed outside of the education system [57]. Examining the impact of these organizations simultaneously not only has potential to improve implementation science in SBMH but may be generalizable to other integrated care settings.

Further, contemporary implementation frameworks typically assume a single organization within which system-level processes influence service quality and implementation success. This perspective does not represent the emerging realities of modern mental health care [3, 4] and inhibits what can be learned about organizational processes in implementation. This study is the first to examine the specific impact of IOA in an integrated care setting. Findings have the potential to spur future implementation science that considers the complex relationships across organizations, inform efforts to design and tailor implementation enhancement interventions that attend explicitly to the inter-organizational context, and optimize the capacity of implementation science to guide practice in increasingly complicated systems of care.

Despite growing emphasis on mechanisms of action in both intervention science [73, 74, 97] and implementation science [75], no research has evaluated the mechanisms through which overlapping organizational contexts impact implementation. This study will use mixed methods to advance the field’s nascent understanding of mechanisms by exploring the pathways through which IOA influences implementation outcomes. Explicitly evaluating these mechanisms represents a significant step forward for implementation research that seeks to develop more precise strategies that target the processes through which organizational factors impact implementation outcomes.

Finally, because the level of SBMH clinician embeddedness in schools reflects their communications, collaborations, and relationships with school staff [115], this study explores the moderating role of SBMH clinician embeddedness on implementation. In SBMH, there is considerable variability in the extent to which clinicians are connected to and collaborate with school personnel [64, 65]. This provides a unique opportunity to examine how organizational influences in general—and IOA in particular—might vary by level of embeddedness.

Limitations

First, the current project is designed to estimate the effects of inter- and intra-organizational variables on implementation outcomes but is not powered for traditional significance testing within a multilevel framework. Recent work shows that studies with less than optimal statistical power can produce important scientific findings and yield more scientific value per dollar spent than studies with larger samples [116]. Further, null hypothesis testing yields less information about the potential reproducibility of study findings than parameter estimates and accompanying confidence intervals [117, 118]. Consequently, we have framed our research questions and design (for both main and moderating effects) in terms of the magnitude, direction, and precision of effect estimates in order to determine the plausibility of effects. There are approximately 13 CBOs operating in the two participating school districts in which we will recruit participants. We intend to include 90–100% of them in the current study, providing effect estimates that approximate the population parameters for these regions.
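
To make the estimation framing concrete, the minimal sketch below (illustrative only; the data and variable names are hypothetical, not drawn from the study) shows how a point estimate is reported together with its magnitude, direction, and precision via an approximate 95% confidence interval, rather than as a pass/fail significance test:

```python
import math
import statistics

def estimate_with_ci(values, z=1.96):
    """Mean effect estimate with an approximate 95% confidence interval,
    reporting magnitude, direction, and precision instead of a p-value."""
    n = len(values)
    mean = statistics.fmean(values)
    se = statistics.stdev(values) / math.sqrt(n)  # standard error of the mean
    return mean, (mean - z * se, mean + z * se)

# Hypothetical clinician-level effect scores (e.g., fidelity differences)
effects = [0.4, 0.1, 0.6, 0.3, 0.2, 0.5]
mean, (lo, hi) = estimate_with_ci(effects)
print(f"estimate = {mean:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

A narrow interval excluding zero suggests a plausible effect even without a formal hypothesis test; a wide interval signals imprecision that a small sample cannot resolve.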

Second, this study is cross-sectional. We considered a longitudinal design that would evaluate changes in organizational constructs (and their alignment) at multiple time points but determined that the limited literature on IOA, the objective of estimating effects, and resource constraints made a cross-sectional design most appropriate. Third, although objective indicators of some implementation outcomes (e.g., integrity) may be ideal, objective outcomes that would be common across agencies, schools, and EBP are not available. Instead, subjective and generalizable individual- and team-level EBP implementation outcome measures were drawn from prior research conducted by members of the study team [24, 79]. These measures allow for responses from multiple informants, which have previously been found to overcome some self-report/common method biases [119, 120].

Conclusion and impact

Assessment of organizational processes in implementation has fallen behind the realities of contemporary integrated healthcare, which often spans multiple providers, settings, and organizations. Given the well-established importance of organizational processes for the success of implementation initiatives, simultaneous attention to multiple, overlapping settings is likely to become increasingly relevant. The sequential findings yielded from the current aims are intended to inform research both within and beyond the education sector as the field of implementation moves increasingly toward mechanistic inquiry as a pathway to identifying the most parsimonious and effective implementation strategies.

Abbreviations

AIM: Acceptability of Intervention Measure
CBO: Community-based organization
EBP: Evidence-based practice
EBPAS: Evidence-Based Practice Attitudes Scale
EPIS: Exploration, Preparation, Implementation, Sustainment
HLM: Hierarchical linear modeling
IAM: Intervention Appropriateness Measure
ICBS: Implementation Citizenship Behavior Scale
ICS: Implementation Climate Scale
ILS: Implementation Leadership Scale
IOA: Inter-organizational alignment
MLM: Multilevel modeling
OIC: Organizational implementation context
SBMH: School-based mental health

Declarations

Acknowledgements

The authors thank the schools and community-based organizations that have agreed to participate in this research. We also thank Shanon Cox and Elissa Picozzi for checking the manuscript references and formatting.

Funding

This project and publication are supported by National Institute of Mental Health (NIMH) grant R21MH110691 (PI: ARL). Additionally, the preparation of this article was supported in part by Institute of Education Sciences grant R305A160114 (PI: ARL), NIMH grant K01MH100199 (PI: JL), NIMH grants R01MH072961 and R01MH092950 (PI: GAA), and National Institute on Drug Abuse (NIDA) grant R01DA038466 (PI: GAA). The content in this article is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or Institute of Education Sciences.

Availability of data and materials

Not applicable.

Authors’ contributions

ARL is project PI and leads the research team. ARL developed the original research grant in collaboration with KW, JL, CRC, MD, and KK and with consultation from GAA, MGE, and MDW. ARL developed the initial manuscript. CD is the project manager and edited the manuscript for clarity and consistency. All authors read and approved the final manuscript.

Ethics approval and consent to participate

This project has received approval from the first author’s Institutional Review Board (IRB), which determined the project to be exempt from review.

Consent for publication

Not applicable.

Competing interests

GAA is an Associate Editor of Implementation Science. All decisions on this manuscript were made by another editor. All other authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors’ Affiliations

(1)
University of Washington, Seattle, USA
(2)
University of Minnesota, Minneapolis, USA
(3)
University of South Carolina, Columbia, USA
(4)
University of Central Florida, Orlando, USA
(5)
University of California San Diego, San Diego, USA
(6)
Child and Adolescent Services Research Center, San Diego, USA

References

  1. Butler M, Kane RL, McAlpine D, Kathol RG, Fu SS, Hagedorn H, et al. Integration of mental health/substance abuse and primary care. Evid Rep Technol Assess. 2008;173:1–362.
  2. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38:4–23.
  3. Croft B, Parish SL. Care integration in the patient protection and affordable care act: implications for behavioral health. Adm Policy Ment Health Ment Health Serv Res. 2013;40:258–63.
  4. World Health Organization. WHO global strategy on people-centred and integrated health services. World Health Organization; 2015.
  5. Asarnow JR, Rozenman M, Wiblin J, Zeltzer L. Integrated medical-behavioral care compared with usual primary care for child and adolescent behavioral health: a meta-analysis. JAMA Pediatr. 2015;169:929–37.
  6. Skowyra KR, Cocozza JJ. Blueprint for change: a comprehensive model for the identification and treatment of youth with mental health needs in contact with the juvenile justice system. Policy Research Associates; 2007.
  7. Kortenkamp K. The well-being of children involved with the child welfare system: a national overview. Urban Inst. 2002;B-43.
  8. Burns BJ, Phillips SD, Wagner HR, Barth RP, Kolko DJ, Campbell Y, et al. Mental health need and access to mental health services by youths involved with child welfare: a national survey. J Am Acad Child Adolesc Psychiatry. 2004;43:960–70.
  9. Skowyra K, Cocozza JJ. A blueprint for change: improving the system response to youth with mental health needs involved with the juvenile justice system. Delmar, NY: National Center for Mental Health and Juvenile Justice; 2006.
  10. Bickman L. A continuum of care: more is not always better. Am Psychol. 1996;51:689.
  11. Unützer J, Chan Y-F, Hafer E, Knaster J, Shields A, Powers D, et al. Quality improvement with pay-for-performance incentives in integrated behavioral health care. Am J Public Health. 2012;102:e41–5.
  12. Bickman L, Smith CM, Lambert EW, Andrade AR. Evaluation of a congressionally mandated wraparound demonstration. J Child Fam Stud. 2003;12:135–56.
  13. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010;17:1–30.
  14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
  15. Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, et al. Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Health Ment Health Serv Res. 2008;35:98.
  16. Glisson C, Green P. Organizational climate, services, and outcomes in child welfare systems. Child Abuse Negl. 2011;35:582–91.
  17. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78:537–50.
  18. Organ DW, Podaskoff PM, MacKenzie SB. Organizational citizenship behavior: its nature, antecedents, and consequences. Sage Publications; 2005.
  19. Aarons GA, Glisson C, Green PD, Hoagwood K, Kelleher KJ, Landsverk JA. The organizational social context of mental health services and clinician attitudes toward evidence-based practice: a United States national study. Implement Sci. 2012;7:56.
  20. Hemmelgarn AL, Glisson C, James LR. Organizational culture and climate: implications for services and interventions research. Clin Psychol Sci Pract. 2006;13:73–89.
  21. Shaw S, Rosen R, Rumbold B. What is integrated care? An overview of integrated care in the NHS. London: Nuffield Trust; 2011.
  22. Beidas RS, Edmunds J, Ditty M, Watkins J, Walsh L, Marcus S, et al. Are inner context factors related to implementation outcomes in cognitive-behavioral therapy for youth anxiety? Adm Policy Ment Health Ment Health Serv Res. 2014;41:788–99.
  23. Guerrero EG, He A, Kim A, Aarons GA. Organizational implementation of evidence-based substance abuse treatment in racial and ethnic minority communities. Adm Policy Ment Health Ment Health Serv Res. 2014;41:737–49.
  24. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the implementation climate scale (ICS). Implement Sci. 2014;9:157.
  25. Aarons G, Ehrhart M, Farahnak L, Sklar M. The role of leadership in creating a strategic climate for evidence-based practice implementation and sustainment in systems and organizations. Front Public Health Serv Syst Res. 2014;3:3.
  26. Lippitt R, Van Til J. Can we achieve a collaborative community? Issues, imperatives, potentials. J Volunt Action Res. 1981.
  27. Stroul BA, Friedman RM. A system of care for severely emotionally disturbed children and youth. 1986. https://eric.ed.gov/?id=ED330167. Accessed 7 Dec 2017.
  28. Jones N, Thomas P, Rudd L. Collaborating for mental health services in Wales: a process evaluation. Public Adm. 2004;82:109–21.
  29. Chuang E, Wells R. The role of inter-agency collaboration in facilitating receipt of behavioral health services for youth involved with child welfare and juvenile justice. Child Youth Serv Rev. 2010;32:1814–22.
  30. Cottrell D, Lucey D, Porter I, Walker D. Joint working between child and adolescent mental health services and the department of social services: the Leeds model. Clin Child Psychol Psychiatry. 2000;5:481–9.
  31. Bai Y, Wells R, Hillemeier MM. Coordination between child welfare agencies and mental health service providers, children’s service use, and outcomes. Child Abuse Negl. 2009;33:372–81.
  32. Glisson C, Hemmelgarn A. The effects of organizational climate and interorganizational coordination on the quality and outcomes of children’s service systems. Child Abuse Negl. 1998;22:401–21.
  33. Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreat. 2012;17:67–79.
  34. Sobo EJ, Bowman C, Gifford AL. Behind the scenes in health care improvement: the complex structures and emergent strategies of implementation science. Soc Sci Med. 2008;67:1530–40.
  35. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74.
  36. Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manag Rev. 1996;21:1055–80.
  37. Weiner BJ, Belden CM, Bergmire DM, Johnston M. The meaning and measurement of implementation climate. Implement Sci. 2011;6:78.
  38. Ehrhart MG, Schneider B, Macey WH. Organizational climate and culture: an introduction to theory, research, and practice. Routledge; 2013.
  39. Zohar D, Luria G. A multilevel model of safety climate: cross-level relationships between organization and group-level climates. J Appl Psychol. 2005;90:616–28.
  40. Palinkas LA, Fuentes D, Finno M, Garcia AR, Holloway IW, Chamberlain P. Inter-organizational collaboration in the implementation of evidence-based practices among public agencies serving abused and neglected youth. Adm Policy Ment Health Ment Health Serv Res. 2014;41:74–85.
  41. Weinberg LA, Zetlin A, Shea NM. Removing barriers to educating children in foster care through interagency collaboration: a seven county multiple-case study. Child Welfare. 2009;88:77–111.
  42. Johnson P, Wistow G, Schulz R, Hardy B. Interagency and interprofessional collaboration in community care: the interdependence of structures and values. J Interprof Care. 2003;17:70–83.
  43. Pullmann MD, Bruns EJ, Daly BP, Sander MA. Improving the evaluation and impact of mental health and other supportive school-based programmes on students’ academic outcomes. Adv School Ment Health Promot. 2013;6:226–30.
  44. Pullmann MD, VanHooser S, Hoffman C, Heflinger CA. Barriers to and supports of family participation in a rural system of care for children with serious emotional problems. Community Ment Health J. 2010;46:211–20.
  45. Kataoka S, Stein BD, Nadeem E, Wong M. Who gets care? Mental health service use following a school-based suicide prevention program. J Am Acad Child Adolesc Psychiatry. 2007;46:1341–8.
  46. Lyon AR, Frazier SL, Mehta T, Atkins MS, Weisbach J. Easier said than done: intervention sustainability in an urban after-school program. Adm Policy Ment Health Ment Health Serv Res. 2011;38:504–17.
  47. Weist MD, Bruns EJ, Whitaker K, Wei Y, Kutcher S, Larsen T, et al. School mental health promotion and intervention: experiences from four nations. Sch Psychol Int. 2017;38:343–62.
  48. Weist MD, Lever NA, Bradshaw CP, Owens JS. Handbook of school mental health: research, training, practice, and policy. Springer Science & Business Media; 2013.
  49. Weist MD, Rowling L. International efforts to advance mental health in schools. Int J Ment Health Promot. 2002;4:3–7.
  50. Burns B, Burns BJ, Costello EJ, Angold A, Tweed D, Stangl D, et al. Children’s mental health service use across service sectors. Health Aff (Proj Hope). 1995;14:147–59.
  51. Farmer EMZ, Burns BJ, Phillips SD, Angold A, Costello EJ. Pathways into and through mental health services for children and adolescents. Psychiatr Serv. 2003;54:60–6.
  52. Kessler RC, Demler O, Frank RG, Olfson M, Pincus HA, Walters EE, et al. Prevalence and treatment of mental disorders, 1990 to 2003. N Engl J Med. 2005;352:2515–23.
  53. Owens JS, Lyon AR, Brandt NE, Warner CM, Nadeem E, Spiel C, et al. Implementation science in school mental health: key constructs in a developing research agenda. Sch Ment Heal. 2014;6:99–111.
  54. Rones M, Hoagwood K. School-based mental health services: a research review. Clin Child Fam Psychol Rev. 2000;3:223–41.
  55. Wilson DB, Gottfredson DC, Najaka SS. School-based prevention of problem behaviors: a meta-analysis. J Quant Criminol. 2001;17:247–72.
  56. Lyon AR, Cook CR, Brown EC, Locke J, Davis C, Ehrhart M, et al. Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implement Sci. 2018;13:5.
  57. Foster S, Rollefson M, Doksum T, Noonan D, Robinson G, Teich J. School mental health services in the United States, 2002–2003. Substance Abuse and Mental Health Services Administration; 2005.
  58. Fazel M, Hoagwood K, Stephan S, Ford T. Mental health interventions in schools in high-income countries. Lancet Psychiatry. 2014;1:377–87.
  59. Eber L, Weist MD, Barrett S. An introduction to the interconnected systems framework. In: Advancing education effectiveness: an interconnected systems framework for positive behavioral interventions and supports (PBIS) and school mental health. Eugene, OR: University of Oregon Press; 2013. p. 3–17.
  60. Forman SG, Fagley NS, Chu BC, Walkup JT. Factors influencing school psychologists’ “willingness to implement” evidence-based interventions. Sch Ment Heal. 2012;4:207–18.
  61. Lyon AR, Charlesworth-Attie S, Vander Stoep A, McCauley E. Research into practice. Sch Psychol Rev. 2011;40:569–81.
  62. Lyon AR, Ludwig K, Romano E, Koltracht J, Stoep AV, McCauley E. Using modular psychotherapy in school mental health: provider perspectives on intervention-setting fit. J Clin Child Adolesc Psychol. 2014;43:890–901.
  63. Aarons GA, Ehrhart MG, Farahnak LR, Hurlburt MS. Leadership and organizational change for implementation (LOCI): a randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implement Sci. 2015;10:11.
  64. Mellin EA, Taylor L, Weist MD. The expanded school mental health collaboration instrument [school version]: development and initial psychometrics. Sch Ment Heal. 2014;6:151–62.
  65. Zarb P, Coignard B, Griskeviciene J, Muller A, Vankerckhoven V, Weist K, et al. The European Centre for Disease Prevention and Control (ECDC) pilot point prevalence survey of healthcare-associated infections and antimicrobial use. Euro Surveill. 2012;17:1–16.
  66. Mellin EA, Weist MD. Exploring school mental health collaboration in an urban community: a social capital perspective. Sch Ment Heal. 2011;3:81–92.
  67. Splett JW, Perales K, Halliday-Boykins CA, Gilchrest CE, Gibson N, Weist MD. Best practices for teaming and collaboration in the interconnected systems framework. J Appl Sch Psychol. 2017;33:347–68.
  68. Garland AF, Haine-Schlagel R, Brookman-Frazee L, Baker-Ericzen M, Trask E, Fawley-King K. Improving community-based mental health care for children: translating knowledge into action. Adm Policy Ment Health Ment Health Serv Res. 2013;40:6–22.
  69. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
  70. Lewis C, Boyd M, Beidas R, Lyon A, Chambers D, Aarons G, et al. A research agenda for mechanistic dissemination and implementation research. 2015.
  71. Hocevar S, Thomas G, Jansen E. Building collaborative capacity: an innovative strategy for homeland security preparedness. In: Innovation through collaboration. Emerald Group Publishing Limited; 2006. p. 255–74. https://doi.org/10.1016/S1572-0977(06)12010-5.
  72. Pajunen K. The nature of organizational mechanisms. Organ Stud. 2008;29:1449–68.
  73. Insel T, Cuthbert B, Garvey M, Heinssen R, Pine DS, Quinn K, et al. Research domain criteria (RDoC): toward a new classification framework for research on mental disorders. Am J Psychiatry. 2010;167:748–51.
  74. Insel TR. The NIMH research domain criteria (RDoC) project: precision medicine for psychiatry. Am J Psychiatry. 2014;171:395–7.
  75. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health Ment Health Serv Res. 2016;43:783–98.
  76. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Health Ment Health Serv Res. 2011;38:44–53.
  77. Glisson C, Green P, Williams NJ. Assessing the organizational social context (OSC) of child welfare systems: implications for research and practice. Child Abuse Negl. 2012;36:621–32.
  78. Locke J, Beidas RS, Marcus S, Stahmer A, Aarons GA, Lyon AR, et al. A mixed methods study of individual and organizational factors that affect implementation of interventions for children with autism in public schools. Implement Sci. 2016;11:135.
  79. Finn NK, Torres EM, Ehrhart MG, Roesch SC, Aarons GA. Cross-validation of the implementation leadership scale (ILS) in child welfare service organizations. Child Maltreat. 2016;21:250–5.
  80. Ehrhart MG, Torres EM, Wright LA, Martinez SY, Aarons GA. Validating the implementation climate scale (ICS) in child welfare organizations. Child Abuse Negl. 2016;53:17–26.
  81. Ehrhart MG, Aarons GA, Farahnak LR. Going above and beyond for implementation: the development and validity testing of the implementation citizenship behavior scale (ICBS). Implement Sci. 2015;10:65.
  82. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS). Ment Health Serv Res. 2004;6:61–74.
  83. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:108.
  84. IBM Corp. IBM SPSS Statistics for Windows, version 19.0. Armonk, NY: IBM Corp; 2010.
  85. StataCorp LP. Stata multilevel mixed-effects reference manual. College Station, TX: Stata Press; 2013. https://www.stata.com/manuals13/me.pdf. Accessed 27 Sep 2016.
  86. Raudenbush SW, Bryk AS. Hierarchical linear models: applications and data analysis methods. Sage; 2002.
  87. R Core Team. R 3.4.2: a language and environment for statistical computing. 2017.
  88. Edwards JR. Alternatives to difference scores as dependent variables in the study of congruence in organizational research. Organ Behav Hum Decis Process. 1995;64:307–24.
  89. Page TJ, Spreng RA. Difference scores versus direct effects in service quality measurement. J Serv Res. 2002;4:184–92.
  90. Aarons GA, Ehrhart MG, Farahnak LR, Finn N. Implementation leadership: confirmatory factor analysis and supervisor-clinician discrepancy in ratings on the implementation leadership scale (ILS). Implement Sci. 2015;10:A70.
  91. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M, Horowitz J. Discrepancies in leader and follower ratings of transformational leadership: relationship with organizational culture in mental health. Adm Policy Ment Health Ment Health Serv Res. 2017;44:480–91.
  92. Shanock LR, Baran BE, Gentry WA, Pattison SC, Heggestad ED. Polynomial regression with response surface analysis: a powerful approach for examining moderation and overcoming limitations of difference scores. J Bus Psychol. 2010;25:543–54.
  93. Leon AC, Davis LL, Kraemer HC. The role and interpretation of pilot studies in clinical research. J Psychiatr Res. 2011;45:626–9.
  94. Kraemer HC, Mintz J, Noda A, Tinklenberg J, Yesavage JA. Caution regarding the use of pilot studies to guide power calculations for study proposals. Arch Gen Psychiatry. 2006;63:484–9.
  95. Snijders TAB. Power and sample size in multilevel linear models. In: Encyclopedia of Statistics in Behavioral Science. John Wiley & Sons, Ltd; 2005. https://doi.org/10.1002/0470013192.bsa492.
  96. Gulliford MC, Ukoumunne OC, Chinn S. Components of variance and intraclass correlations for the design of community-based surveys and intervention studies: data from the health survey for England 1994. Am J Epidemiol. 1999;149:876–83.
  97. Kazdin AE. Mediators and mechanisms of change in psychotherapy research. Annu Rev Clin Psychol. 2007;3:1–27.
  98. Creswell JW, Klassen AC, Plano Clark VL, Smith KC. Best practices for mixed methods research in the health sciences. Bethesda, MD: Office of Behavioral and Social Sciences Research; 2011.
  99. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. Sage Publications; 2007.
  100. Edhlund BM. NVivo 11 essentials: your guide to the world’s most powerful data analysis software. 2016.
  101. Bradley EH, Curry LA, Devers KJ. Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Serv Res. 2007;42:1758–72.
  102. Glaser BG, Strauss A. The discovery of grounded theory: strategies for qualitative research. Chicago: Aldine; 1967.
  103. Strauss AC, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Sage Publications; 1990.
  104. DeSantis L, Ugarriza DN. The concept of theme as used in qualitative nursing research. West J Nurs Res. 2000;22:351–72.
  105. Hill CE, Thompson BJ, Williams EN. A guide to conducting consensual qualitative research. Couns Psychol. 1997;25:517–72.
  106. Hill CE, Knox S, Thompson BJ, Williams EN, Hess SA, Ladany N. Consensual qualitative research: an update. J Couns Psychol. 2005;52:196.
  107. Lyon AR, Ludwig K, Romano E, Leonard S, Stoep AV, McCauley E. “If it’s worth my time, I will make the time”: school-based providers’ decision-making about participating in an evidence-based psychotherapy consultation program. Adm Policy Ment Health Ment Health Serv Res. 2013;40:467–81.
  108. Lyon AR, Ludwig K, Wasse JK, Bergstrom A, Hendrix E, McCauley E. Determinants and functions of standardized assessment use among school mental health clinicians: a mixed methods evaluation. Adm Policy Ment Health Ment Health Serv Res. 2016;43:122–34.
  109. Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18:59–82.
  110. Sandelowski M. Sample size in qualitative research. Res Nurs Health. 1995;18:179–83.
  111. Adelman H, Taylor L. Moving prevention from the fringes into the fabric of school improvement. J Educ Psychol Consult. 2000;11:7–36.
  112. Forman SG, Olin SS, Hoagwood KE, Crowe M, Saka N. Evidence-based interventions in schools: developers’ views of implementation barriers and facilitators. Sch Ment Heal. 2009;1:26.
  113. Jennings J, Pearson G, Harris M. Implementing and maintaining school-based mental health services in a large, urban school district. J Sch Health. 2000;48:201.
  114. Knies K. The influence of organizational climate on the use and quality of evidence-based practices in school mental health [M.A. thesis]. University of South Carolina; 2014. https://search.proquest.com/docview/1651954005/abstract/CAD7469944544606PQ/1. Accessed 28 Dec 2017.
  115. Weist MD, Mellin EA, Chambers KL, Lever NA, Haber D, Blaber C. Challenges to collaboration in school mental health and strategies for overcoming them. J Sch Health. 2012;82:97–105.
  116. Bacchetti P. Current sample size conventions: flaws, harms, and alternatives. BMC Med. 2010;8:17.
  117. Cumming G. The new statistics: why and how. Psychol Sci. 2014;25:7–29.
  118. Gelman A. Going beyond the book: towards critical reading in statistics teaching. Teach Stat. 2012;34:82–6.
  119. Bruns EJ, Suter JC, Force MM, Burchard JD. Adherence to wraparound principles and association with outcomes. J Child Fam Stud. 2005;14:521–34.
  120. Schoenwald SK, Garland AF, Chapman JE, Frazier SL, Sheidow AJ, Southam-Gerow MA. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Health Ment Health Serv Res. 2011;38:32–43.

Copyright

© The Author(s). 2018
