Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment

Abstract

Background

Bridging factors are relational ties, formal arrangements, and processes that connect outer system and inner organizational contexts. They may be critical drivers of evidence-based practice (EBP) implementation and sustainment. Yet, the complex interplay between outer and inner contexts is often not considered. Bridging factors were recently defined in the updated Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Further identification and specification of this construct will advance implementation models, measures, and methods. Our goal is to advance bridging factor research by identifying relevant dimensions and exemplifying these dimensions through illustrative case studies.

Methods

We used a multiple case study design. Each case (n = 10) represented different contexts, EBPs, and bridging factor types. Inclusion criteria were the presence of clearly distinguishable outer and inner contexts, identifiable bridging factor, sufficient information to describe how the bridging factor affected implementation, and variation from other cases. We used an iterative qualitative inquiry process to develop and refine a list of dimensions. Case data were entered into a matrix. Dimensions comprised the rows and case details comprised the columns. After a review of all cases, we collectively considered and independently coded each dimension as function or form.

Results

We drew upon the concepts of functions and forms, a distinction originally proposed in the complex health intervention literature. Function dimensions help define the bridging factor and illustrate its purpose as it relates to EBP implementation. Form dimensions describe the specific structures and activities that illustrate why and how the bridging factor has been customized to a local implementation experience. Function dimensions can help researchers and practitioners identify the presence and purpose of bridging factors, whereas form dimensions can help us understand how the bridging factor may be designed or modified to support EBP implementation in a specific context. We propose five function and three form bridging factor dimensions.

Conclusions

Bridging factors are described in many implementation models and studies, but without explicit reference or investigation. Bridging factors are an understudied and critical construct that requires further attention to facilitate implementation research and practice. We present specific recommendations for a bridging factors research agenda.

Background

The implementation and sustainment of evidence-based practices (EBPs) require the simultaneous and continued coordination, support, and engagement of actors across outer and inner contexts [1,2,3,4]. Bridging factors span the outer and inner contexts and are crucial for this coordination to occur. Bridging factors are defined as “factors that cross or link the outer system and inner organizational context” [5]. To date, implementation research has focused heavily on inner context implementation work [5,6,7]. Research on the outer context has focused on how to affect policy (e.g., through targeted packaging and dissemination of research evidence) in ways that can be used at different jurisdictional levels, including whole countries [8, 9]. Although there have been calls for further efforts to transfer knowledge to outer context policymakers and system leaders [10, 11], this knowledge transfer has lagged in implementation science. Furthermore, less attention has been paid to how bridging factors between the outer and inner contexts can be leveraged to support EBP implementation and sustainment.

Bridging factors are the connective tissue between system parts and may be broadly organized as relational ties, formal arrangements, and processes [12]. Bridging factors may be relational ties, such as a partnership between a government agency and community-based organizations, or an EBP intermediary that facilitates intervention adaptation and training with system and organizational leaders [5, 12]. They may also be formal arrangements, including contracts between public sector agencies and local nonprofits, or a policy that provides a fiscal incentive for EBP use and facilitates EBP integration [12]. Other bridging factors may be process-oriented, such as data sharing procedures between state and local entities or accreditation, which links program developers with implementing organizations [12].

The need for a bridging factors research agenda to advance implementation research and practice

We assert that identifying and leveraging bridging factors is an urgent priority for implementation research. First, understanding the interconnections between outer and inner contexts may be critical for EBP implementation and sustainment. This need is espoused by multiple implementation framework developers, but is often not addressed in implementation research. For example, the Dynamic Sustainability Framework states that further specification of the levels within a system and the interrelationships between them is an important area for future research [13]. The Integrated Sustainability Framework also notes that “dynamic interactions between outer contextual factors, inner contextual or organizational factors, processes, intervention characteristics, and implementer characteristics influence sustainability” [14]. The bridging factors construct can help us outline how specific relational structures, formal arrangements, and processes that connect outer and inner contexts support or hinder EBP sustainment.

Bridging factors research is also an urgent priority because it can advance the range of existing conceptual frameworks that acknowledge both outer and inner contexts for EBP implementation. Recent work identifies bridging factors in the Exploration-Preparation-Implementation-Sustainment (EPIS) framework and demonstrates a methodological approach for studying bridging factors using contracting arrangements as an example [5, 12]. However, there is variability in the acknowledgement of the bridging factors construct within other commonly used implementation frameworks. For example, the Consolidated Framework for Implementation Research (CFIR), Reach, Effectiveness, Adoption, Implementation, Maintenance (RE-AIM), and the Interactive Systems Framework (ISF) allude to the reciprocal effects, processes, or influences between outer system and inner organizational implementation actors [15,16,17]. The CFIR is inclusive of the outer and inner context stakeholders who may take part in activities in the Process domain [15]. The ISF explicitly includes the reciprocal effects between the “delivery system,” “support system,” and “synthesis and translation system,” all of which span the outer policy and inner organizational contexts [17]. Explicitly including bridging factors as a way to understand the relationships between outer and inner contexts can enhance our use of existing implementation frameworks.

Bridging factors are distinct from implementation strategies. However, they can enhance two streams of implementation strategy research and practice: (1) multifaceted, large-scale strategies (e.g., those that involve entities within an entire county, province, prefecture, or state), and (2) discrete, inner context-focused strategies. Multifaceted, large-scale implementation strategies that require coordination among outer and inner actors are understudied compared to other strategy types [10, 18,19,20,21]. Examples in the Expert Recommendations for Implementing Change project include accessing new funding, altering incentives for implementation or creating disincentive (de-implementation) structures, modifying payments or fees, building a coalition or community-academic partnership, and staging implementation scale-up [22]. Bridging factors may be planned implementation strategies or unplanned contextual features that activate or impede the use of a multifaceted, large-scale strategy. For example, a community-academic partnership may consist of outer context actors (the academic institution and service system leaders) and inner context actors (in local organizations). The partnership may be created to intentionally support the implementation of a new EBP. However, this partnership could also be an existing contextual feature that helps to activate a different multifaceted implementation strategy (e.g., staging EBP scale-up or designing new incentive or disincentive structures). This example shows how bridging factors can be applied to existing implementation research topics including community-academic partnerships (e.g., [23,24,25]), learning collaboratives and learning communities (e.g., [26,27,28]), and inter-organizational collaboration [4, 29].

Identification of bridging factors can also enhance research on discrete, inner context-focused implementation strategies. Although much progress has been made in specifying and reporting discrete implementation strategies, we often do not account for the outer-inner context dynamics that these strategies are used within. In fact, information about context is noticeably absent in published reporting guidelines [27, 30]. This tension is exemplified in technology-enabled services implementation (i.e., online or mobile platforms to deliver healthcare) wherein the ability to execute common inner context-focused implementation strategies may be supported or constrained by the relationship that inner context actors have with the broader service environment [31].

In another example of a recent organizationally-focused implementation trial, a statewide policy initiative distracted parties (across all levels of the organization, including executives, supervisors, and providers) from the specific EBP implementation goal, as they were required to meet new documentation and process obligations to maintain their organizational funding [32, 33]. Without specifying this bridging factor (the statewide policy initiative), it may appear that the implementation strategy itself was less effective. Specifying and reporting bridging factors can help implementers adapt inner context-focused strategies to fit outer context characteristics and disruptions.

Further, this example illustrates how the success of the same strategy in different settings may be more fully explained by bridging factors. In the context of implementation research, mechanisms are defined as “a process or event through which an implementation strategy operates to affect desired implementation outcomes” [34]. Mechanisms are multilevel and can be activated by implementation strategies at intrapersonal, organizational, community, and macro policy levels of analysis [34, 35]. While a bridging factor in and of itself is not a mechanism, explicating relevant bridging factors can shed light onto multilevel mechanisms that explain implementation strategy effectiveness. The bridging factors construct draws our attention not only to the multilevel nature of strategies and their mechanisms, but also to the bi-directional (e.g., top-down and bottom-up) processes that may explain how a strategy works.

Integrating bridging factors with existing organizational theories and concepts

Organizations operate within the external environment, and this premise is the basis of longstanding theories that fall within a broader open systems perspective. Open systems theorists treat organizations as permeable. The external environment “shape[s], support[s], and infiltrate[s] organizations,” and organizations exchange resources, information, and personnel with entities outside the boundaries of the inner context [36]. Complexity theory, for example, centers on how systems consist of multiple components that relate to and interact with one another in dynamic ways [37, 38]. Transaction cost economics, institutional, contingency, and resource dependence theories can also help to frame research questions about how the outer context influences the inner organizational context over the course of implementation [39,40,41,42,43].

Of note, bridging factors are alluded to in resource dependence theory and are described as an organization’s “efforts to control or in some manner coordinate one’s actions with those of formally independent entities” [36]. According to this theory, specific bridging tactics include forming alliances or merging with other organizations, and each tactic represents a way that an organization can manage interdependence with the broader environment [36]. These bridging tactics may be operationalized within the construct of bridging factors by looking at specific formal arrangements (e.g., contracts between organizations) or processes (e.g., data sharing or staff integration processes when a merger takes place) that influence EBP implementation.

There is also the concept of bridging organizations, defined as independent organizations whose role is to link center (high status, high power) and border (low status, low power) organizations [44]. Bridging organizations bring together diverse stakeholders to facilitate inter-organizational collaboration and coordination [45]. They help to build trust and resolve conflict, and provide an opportunity for sensemaking and learning across organizations [45]. They can also help disseminate shared visions and new innovations [46]. EBP intermediary/purveyor organizations are an example of bridging organizations that connect intervention developers with implementing organizations [47]. They build agency and system capacity through consultation, technical assistance, outcome evaluation, and training in the intervention [47]. Bridging organizations may be in the relational tie category of bridging factors.

Finally, there are similar individual and team level concepts in the social network literature, for example, bridges and brokers, which fall within the broader concept of boundary spanners [48,49,50,51,52,53]. Boundary spanners close structural holes between networks and serve many of the same functions as bridging organizations, e.g., building trust, resolving conflict, and acting as a conduit for resource and information exchange [49, 54, 55]. Individuals or teams who serve a boundary spanning role between different outer and inner context social networks may also be in the relational tie category of bridging factors. Bridging factors research can supplement existing theories by identifying concrete and malleable bi-directional linkages, and articulating the dynamic interplay that occurs between organizations and the outer context as an EBP becomes embedded in an implementation setting.

Summary of our argument

We argue that enhanced identification, operationalization, and understanding of bridging factors will allow researchers and practitioners to proactively investigate and leverage them during implementation efforts. This can inform the development of strategies that support implementation processes (from adoption to sustainment) and improve the public health impact of EBPs. The goals of this paper are to (1) raise awareness about the importance of specifying and reporting bridging factors, (2) advance the bridging factors construct through a presentation of dimensions and illustrative case studies, and (3) discuss recommendations for bridging factors research.

Methods

Case selection

We used a multiple case study design [56] and iteratively reviewed cases to inform the development and refinement of a list of dimensions of bridging factors. We assessed the unique features of each case and patterns across the cases [56]. Cases were purposefully sampled to ensure diversity across EBP, context, and bridging factor type. To ensure breadth of examples, we drew upon cases from the authors’ research experiences and from discussions with colleagues who conduct multilevel implementation research and engage in implementation practice. During these discussions, we sought to gather enough detail to understand what the bridging factor was and identify specific ways that it affected implementation processes in both the outer and inner contexts. Inclusion criteria for the cases were (1) the presence of clearly distinguishable outer and inner contexts, (2) an identifiable bridging factor, (3) sufficient information to describe how the bridging factor affected implementation, and (4) variation from other cases.

Code development and application

Each member of the research team independently generated a list of potentially relevant bridging factor dimensions. Following independent list development, we convened for six consensus-building meetings to create and refine a dimension list, discuss each dimension’s purpose (function or form), and compare our conceptualization of the dimension with material from the cases [57]. During the meetings, we developed and used a matrix whereby the list of dimensions comprised the rows and examples from the cases comprised the columns [58]. Based on our discussion of the cases and our collective reflection about the type of information gleaned for each of the dimensions, we (RL, NS, KD, and JM) independently coded each dimension as one that describes the function or the form of the bridging factor (described below). Given the complexity of the form dimensions, we anchored sub-categories within broader categories to organize this information. We discussed each discrepancy and GA acted as a tiebreaker because he is the most senior researcher and experienced in cross-context implementation research.

Sample

We reviewed 10 cases representing three types of bridging factors: relational ties (n = 5), formal arrangements (n = 3), and processes (n = 2). Four cases were excluded from our final sample because the EBP was already represented in two other SafeCare-focused cases (n = 1), detailed information was not publicly available (n = 1), outer and inner context boundaries were not sufficiently clear (n = 1), and the information provided was not distinct enough from other relational tie examples (n = 1). The outer contexts in our sample were a public sector child welfare system, public sector mental health system, local government in a low- and middle-income country (LMIC) setting, substance use treatment organizations in the community, the State, local and state health departments, program developers, and a university medical center. The inner contexts were organizations delivering the EBP(s), churches, local child welfare agencies, public sector mental health systems, and a state-run prison. The cases illustrated bridging factors across very different contexts and service settings. Case information is summarized in Table 1 and full case study reports are presented in the Additional file 1.

Table 1 Summary of case studies

Results

Bridging factor dimensions: organized by functions and forms

To categorize our bridging factor dimensions, we drew upon the concepts of functions and forms, a distinction that was originally proposed in the complex health intervention literature [59,60,61]. We conceptualize function bridging factor dimensions as core characteristics that define the bridging factor and speak to its purpose as it relates to EBP implementation. We conceptualize form bridging factor dimensions as characteristics that describe the specific structures, activities, and strategies that illustrate why and how the bridging factor has been customized to a local implementation experience. We envision that function dimensions will help researchers and practitioners identify the presence and purpose of a bridging factor, while the form dimensions will help researchers and practitioners understand how the bridging factor may be purposefully designed or modified to support EBP implementation in a specific context. We propose five function bridging factor dimensions and three form bridging factor dimensions. Each form dimension has three sub-categories. The dimensions are described below and summarized in Table 2.

Table 2 Bridging factor dimensions

Five function bridging factor dimensions

  1. Type (relational tie, formal arrangement, and/or process) [6]

  2. Outer context

  3. Inner context

  4. Capital exchanged across the boundary of the defined outer and inner contexts

  5. Impact the bridging factor has on the outer and/or inner contexts

The first dimension is specification of bridging factor type, and we expect new types to emerge in future research. The next two function dimensions are identification of the outer and inner contexts, which is essential for defining what the “bridge” consists of. Our case studies reinforced the nested nature of implementation contexts (e.g., counties within states) and the importance of explicitly defining a project’s contextual boundaries. For the fourth dimension (capital exchanged), we take a broad view of the human, social, fiscal, knowledge, and time-related capital that may be relevant across the implementation phases. Examples include funding, EBP expertise and knowledge, client referrals, training and coaching capacity, social norms, program data and client information, and communication between individuals. For the fifth dimension (impact on outer and/or inner context), we assert that bridging factors can positively and/or negatively influence EBP implementation and sustainment. Case 8, for example, illustrates how a bridging factor can negatively affect implementation in the outer and inner contexts. Furthermore, because bridging factors span both contexts, we expect to see top-down (outer influencing inner) and bottom-up (inner influencing outer) processes. This bi-directional influence is reflected in Table 1 (Function Row 5), which describes the impact that a bridging factor can have on outer and inner contexts. Our cases revealed that some bridging factors impacted both contexts in a similar way (e.g., cases that denote the same impact on outer and inner). However, bridging factors may influence outer and inner contexts differently (cases 9, 10) or show a tangible impact in only the outer or the inner context (case 7). We acknowledge the limitations of retrospectively describing impact and expect the bi-directional influence and interaction that occurs through bridging factors to be more clearly described in future studies that prospectively and systematically examine top-down and bottom-up bridging processes.

Three form bridging factor dimensions

  1. Origin

     a. Rationale

     b. Implementation strategy

     c. Regulatory context

The origin dimension helps researchers and practitioners understand the source of the bridging factor. The sub-categories are (a) rationale for the creation of the bridging factor, (b) the degree to which the bridging factor is a planned and deliberate implementation strategy (e.g., changing service provision contracts to require EBP implementation), and (c) regulatory context—that is, whether the bridging factor is enforceable (cases 1, 2, 5, 8), mandatory (cases 5, 7, 8), encouraged (cases 3, 4), and/or voluntary (cases 2, 3, 6, 9, 10). The origin dimension helps illustrate aspects of the bridging factor that are modifiable or fixed, and which stakeholders may need to be engaged as the bridging factor is developed or modified.

  2. Dynamism

     a. Duration

     b. Change across the implementation phases

     c. Supports

This dimension describes the dynamic or adaptive nature of the bridging factor and has three sub-categories. First (a) is the duration of the bridging factor. Knowing whether the bridging factor is short- or long-term can help researchers and practitioners assess when it may influence EBP implementation, or at what phase it will have the most influence. What constitutes a short- or long-term bridging factor depends on the implementation context and the specific bridging factor. Second (b) is how the bridging factor changes or is expected to change across the implementation phases. Describing if and how the bridging factor changes (e.g., for political, regulatory, societal, or policy reasons) can bring to light environmental opportunities and constraints that affect the degree to which the bridging factor can be leveraged as a formal implementation strategy. Third (c), bridging factor supports articulates two specific elements: the resources or structures needed to support the bridging factor, and whether the bridging factor leverages existing resources and supports or requires new ones. Bridging factor supports may include financial investment, contract requirements, or legislative mandates. We emphasize that these supports relate directly to the bridging factor and are not supports for more narrowly defined EBP implementation activities.

  3. Scope

     a. Multiple systems

     b. General or specific

     c. Outcomes

This dimension describes the scope of the bridging factor. The first sub-category (a) is whether the bridging factor links multiple service systems such as child welfare and mental health or federal, state, provincial, and regional health systems. The second sub-category (b) is whether the bridging factor is general across EBPs or specific to a particular EBP (case 3). These sub-categories inform the breadth and depth of the bridging factor and have implications for the types of stakeholders needed to effectively link within and across systems and in service of one or multiple EBPs. This is important given that bridging factors often involve key partnerships and leveraging resources, processes, and arrangements that require stakeholder involvement in their creation or maintenance (even if the bridging factor is mandatory). The last sub-category (c) captures the impact (actual, potential, or expected) that the bridging factor has or is intended to have on outcomes.

In summary, our cases illustrate how contracting arrangements, fiscal incentives, partnerships, earmarked taxes, interagency collaborations, data sharing and accreditation processes, and even individuals can act as bridging factors that link an outer and inner context during EBP adoption, implementation, and sustainment. Recognizing the diversity of bridging factors (in our cases as well as those yet to be identified), we offer function and form dimensions as a way to organize and report information across implementation studies, with the hopes of building a bridging factors knowledge base. Function dimensions include specifying the type of bridging factor, describing the outer and inner contexts (the contextual boundaries of the bridge), capital exchanged, and impact on outer and inner contexts. The form dimensions help us understand what a bridging factor looks like in a specific implementation setting. These dimensions are origin, dynamism, and scope (with various sub-forms in each). We next outline our bridging factors research agenda.

Recommendations for a bridging factor research agenda

For theory

There are numerous ways that the bridging factors construct can be utilized in implementation theories, frameworks, and models. Bridging factors provide a way to more deeply consider how outer and inner contexts are related and what strategies might be used to bridge them. In turn, those approaches might generalize to other public health concerns, diseases, or clinical and service settings. Considering how bridging factors operate similarly or differently across levels and in various settings (e.g., countries, provinces, states, counties), combined with frameworks such as EPIS, CFIR, and ISF, can help us develop ways to integrate, understand, and assess the differential impact and tailoring needs of bridging strategies for specific contexts. For example, if we narrow consideration to setting and clinical concern (e.g., a national health system and addressing depression), there might be a number of multilevel mechanisms by which the outer context can influence clinical practice through a planned implementation strategy (e.g., medication formularies, funding for psychosocial interventions). This may vary by context, for example, in LMICs where mental health may be a lower priority than other public health concerns.

It may also be helpful to consider bridging factors within the major types of frameworks categorized by Nilsen [62]. As noted by Nilsen, determinant frameworks may be limited in regard to the process of conducting implementation [62]. To address this gap, determinant frameworks might be adapted to consider bridging factors. Process models may be tailored and/or adapted to consider how bridging factors influence key processes, including the critical multilevel mechanisms that allow strategies at one level to influence implementation outcomes at another level [63]. Finally, evaluation frameworks may benefit from actively including the assessment of bridging factors as they may support or hinder an implementation effort. Regardless of the category of model, theory, or framework, we recommend considering and integrating bridging factors.

For measurement and design

Bridging factors add another element to consider in research design and measurement. It could be argued that without due consideration of bridging factors, studies do not portray or assess the full picture of implementation, which may result in Type 3 errors. This runs counter to the principles of implementation science and may inadvertently lead to inappropriate data analyses if the bi-directional and multilevel nature of implementation influences is not considered. The multilevel nature of bridging factors needs to be accounted for in any analysis that involves these cross-level linkages. Some bridging factors may be captured using established quantitative scales (e.g., leadership). However, a recent review of measures of outer context constructs used in behavioral health research highlighted the limited quantitative investigation of system-level and policy influences in terms of both quantity and quality [64]. Interestingly, 7 of the 20 measures found in this review were related to cosmopolitanism, which is defined in CFIR as the “degree to which an organization is networked with other external organizations,” and thus links to the concept of bridging factors. Unfortunately, the quality of the measures was poor, with the highest psychometric score being 5 out of a possible 36. Similar reviews have also been conducted in public health and community settings [64] and health care quality improvement initiatives [65].

Nevertheless, we consider qualitative methods as being key to measuring and understanding bridging factors. Qualitative methods are particularly well suited to assess complexity and context and may be included in mixed methods studies. Qualitative methods (e.g., interviews and focus groups) may elicit bridging factors that were not considered in the study development or factors that were not present in the early phases of the implementation process. Qualitative methods are also able to capture the dynamic nature of bridging factors and the varying degree of influence that they have throughout the implementation process from different outer and inner stakeholder perspectives. We recommend that implementation researchers consider pursuing funding for in-depth, semi-structured interviews and/or focus groups to describe and explain the influence of bridging factors on implementation strategies, processes, and outcomes. The questions posed in Table 2 can be a foundation for interview scripts.

For reporting

As our cases suggest, bridging factors can influence the implementation process, and they may facilitate or hinder activities within each phase of implementation. They may also interact with implementation strategies and illuminate the mechanisms integrated within these efforts. For example, case 6 underscored the importance of the capital exchanged to facilitate the preparation, implementation, and sustainment of the California Professional Training and Information Network (CAPTAIN) [66]. This included leveraging social capital through the involvement of key stakeholders spanning multiple levels and the incorporation and alignment of existing infrastructure and resources. However, similar to the history of implementation strategies and mechanisms [34, 67], there has been limited reporting and specifying of bridging factors to date, likely due to limited awareness and development of guidance around this key group of implementation factors. We recommend further specification and reporting of such factors to continue advancing the field and extending the impact of implementation efforts. This includes explicit identification, provision of detailed descriptions, and measurement of their impact [30].

Similar to the rationale for further specification of other implementation methods, bridging factor specification will aid efforts to replicate or scale up, especially given their potential as key determinants. Efforts to replicate and/or scale up the CAPTAIN model statewide, for example, would be greatly impeded if the alignment of existing outer context priorities and policies and the involvement of existing inner context resources had not been explicitly specified and reported, permitting outer-inner context integration in further implementation efforts [68]. In addition to the proposed dimensions, the growing set of guidelines and recommendations for specifying and reporting implementation methods (e.g., use of implementation strategies and implementation frameworks [67, 69]) aligns well with this recommendation.

For outcomes research

Because bridging factors bi-directionally influence outer and inner contexts, they may influence implementation outcomes across contexts. For example, our partnership case studies illustrated how bridging factors can shape implementation outcomes in both outer and inner contexts. Partnerships built on shared goals, trust, and high levels of engagement among outer and inner context stakeholders may positively impact sustainment, fidelity, and reach in LMIC faith-based organizations (case 3), child welfare (case 4), and autism service settings (case 6). Case 8 demonstrated how the dissolution of a forced (e.g., legislatively or structurally mandated) partnership helped to explain why an EBP was discontinued despite successful implementation and clinical outcomes.

As the field of implementation science evolves, and as bridging factor conceptualization continues to be refined, it will be important for researchers to systematically and rigorously test the impact of bridging factors on implementation outcomes and the contextual considerations that explain this impact. For example, the question of the conditions under which certain bridging factors positively or negatively impact implementation outcomes is ripe for future investigation. We suggest using Proctor et al.’s taxonomy as a starting point to evaluate the impact of bridging factors [70].

For research about implementation phases and stages

In addition to their role in supporting key cross-context linkages between the outer and inner contexts, bridging factors are a cross-cutting feature across implementation phases and stages. In practice, the impact of a bridging factor may be felt in only one or two phases or stages of implementation, and bridging factors may be specifically designed to be short-term or time limited. However, there is immense potential for these factors, especially those designated as long-term, to exert their impact across all phases or stages of implementation. Bridging factors can also affect the process and progress within an implementation stage, serving to accelerate or impede the completion of implementation activities within each stage and the number of stages completed. For example, case 2 highlighted how the formal oversight structure and supports that accompanied the fiscal incentives during the preparation phase facilitated large-scale EBP implementation and sustainment countywide [71]. Given the dynamic and evolving nature of bridging factors, modifications may be made to accommodate the varying foci or goals of each implementation phase or stage, as well as in response to the evolving demands of the outer and/or inner contexts.

For practice and policy

Well-specified bridging factors have great potential to serve as policy levers that can inform and change practice. A clear understanding of the roles of bridging factors and their impacts on outer and inner context variables, including implementation outcomes, can help guide decisions about policies that shape EBP implementation and practice delivery. For example, case 5 suggested that the structure of a policy (taxes) has meaningful implications for service delivery decisions (e.g., types of services, interventions selected, workforce capacity to deliver services). Similarly, case 2 illustrated the importance of a solid and stable infrastructure to support the execution and ongoing maintenance of a system-wide transformation of care delivery. As described in earlier sections, it is critical for bridging factors to be well specified and defined.

Challenges

We found that some bridging factor dimensions were easier to identify and report (e.g., whether the factor is a planned implementation strategy) than others (e.g., capital exchanged). Additionally, in-depth system knowledge was necessary to understand how bridging factors operated and influenced implementation processes. This required collecting data from individuals with rich experience in a particular setting. Some information may also be sensitive or political in nature and therefore more difficult to gather and use. Furthermore, we found that the nested nature of the outer and inner contexts can make it difficult to define the boundaries of a particular bridging factor. It is also likely that similar processes bridge levels within outer and inner contexts. That being said, we encourage researchers to be as specific as possible when identifying the outer and inner contexts of interest. A final challenge of conducting bridging factors research is that operationalization is still in its infancy. As a result, the existing literature does not explicitly report and measure bridging factors, and we had to rely on our research experiences and discussions. We hope that this manuscript offers initial guidance for reporting and measuring bridging factors in future work.

Conclusions

In implementation science, bridging factors research is in its early stages. Previous work formally added the construct to the EPIS framework and illustrated a specific methodological approach and contracting example [5, 12]. This paper recognizes that the next steps are to report and measure bridging factors so that they can be systematically tested and proactively leveraged during EBP implementation and sustainment. Our primary task in this paper was to make bridging factors more concrete and specific. To achieve this, we examined diverse bridging factor cases and articulated dimensions that can help describe bridging factor functions and forms. Finally, we put forth recommendations for theory, measurement and design, reporting, outcomes, and research that spans the implementation phases and stages. We also began a discussion of the practice and policy implications of this work. We hope that this paper inspires future bridging factors research so that we can strengthen linkages across outer and inner contexts, improve implementation outcomes, and enhance the public health impact of EBPs.

Availability of data and materials

Not applicable.

Change history

  • 14 April 2021

    Due to a typesetting mistake, a text fragment was inserted in the original publication. The article has been updated to rectify the error.

Abbreviations

EBP:

Evidence-based practice

EPIS:

Exploration, Preparation, Implementation, and Sustainment

CFIR:

Consolidated Framework for Implementation Research

RE-AIM:

Reach, Effectiveness, Adoption, Implementation, Maintenance

ISF:

Interactive Systems Framework

LMIC:

Low- and middle-income country

References

  1. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, Roesch SC. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed method study. Adm Policy Ment Health. 2016;43(6):991–1008. https://doi.org/10.1007/s10488-016-0751-4.

  2. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35(1):255–74. https://doi.org/10.1146/annurev-publhealth-032013-182447.

  3. Green AE, Trott E, Willging CE, Finn NK, Ehrhart MG, Aarons GA. The role of collaborations in sustaining an evidence-based intervention to reduce child neglect. Child Abuse Negl. 2016;53:4–16. https://doi.org/10.1016/j.chiabu.2015.11.013.

  4. Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, Chaffin MJ. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;43(6):915–28. https://doi.org/10.1080/15374416.2013.876642.

  5. Moullin JC, Dickson KS, Stadnick N, Rabin B, Aarons GA. Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implement Sci. 2019;14(1):1. https://doi.org/10.1186/s13012-018-0842-6.

  6. Bruns EJ, Parker EM, Hensley S, Pullmann MD, Benjamin PH, Lyon AR, Hoagwood KE. The role of the outer setting in implementation: associations between state demographic, fiscal, and policy factors and use of evidence-based treatments in mental healthcare. Implement Sci. 2019;14(1):96. https://doi.org/10.1186/s13012-019-0944-9.

  7. Novins DK, Green AG, Legha RK, Aarons GA. Dissemination and implementation of evidence-based practices for child and adolescent mental health: a systematic review. J Am Acad Child Adolesc Psychiatry. 2013;52(10):1009–25. https://doi.org/10.1016/j.jaac.2013.07.012.

  8. Lavis JN, Lomas J, Hamid M, Sewankambo NK. Assessing country-level efforts to link research to action. Bull World Health Org. 2006;84(8):620–8. https://doi.org/10.2471/BLT.06.030312.

  9. Lavis JN, Robertson D, Woodside JM, McLeod CB, Abelson J. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q. 2003;81(2):221–48. https://doi.org/10.1111/1468-0009.t01-1-00052.

  10. Purtle J, Peters R, Brownson RC. A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007–2014. Implement Sci. 2015;11(1):1. https://doi.org/10.1186/s13012-015-0367-1.

  11. Purtle J, Brownson RC, Proctor EK. Infusing science into politics and policy: the importance of legislators as an audience in mental health policy dissemination research. Adm Policy Ment Health. 2017;44(2):160–3. https://doi.org/10.1007/s10488-016-0752-3.

  12. Lengnick-Hall R, Willging C, Hurlburt M, Fenwick K, Aarons GA. Bridging factors linking outer and inner contexts: a longitudinal study of the role of contracting in implementation and sustainment. Implement Sci. 2020;15:43.

  13. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.

  14. Shelton RC, Rhoades Cooper B, Wiltsey SS. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39(1):55–76. https://doi.org/10.1146/annurev-publhealth-040617-014731.

  15. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50. https://doi.org/10.1186/1748-5908-4-50.

  16. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7. https://doi.org/10.2105/AJPH.89.9.1322.

  17. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Blachman M, Dunville R, Saul J. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Community Psychol. 2008;41(3-4):171–81. https://doi.org/10.1007/s10464-008-9174-z.

  18. Powell BJ, Beidas RS, Rubin RM, Stewart RE, Wolk CB, Matlin SL, Weaver S, Hurford MO, Evans AC, Hadley TR, Mandell DS. Applying the policy ecology framework to Philadelphia's behavioral health transformation efforts. Adm Policy Ment Health. 2016;43(6):909–26. https://doi.org/10.1007/s10488-016-0733-6.

  19. Hoagwood KE, Olin SS, Horwitz S, McKay M, Cleek A, Gleacher A, Lewandowski E, Nadeem E, Acri M, Chor KHB, Kuppinger A, Burton G, Weiss D, Frank S, Finnerty M, Bradbury DM, Woodlock KM, Hogan M. Scaling up evidence-based practices for children and families in New York State: toward evidence-based policies on implementation for state mental health systems. J Clin Child Adolesc Psychol. 2014;43(2):145–57. https://doi.org/10.1080/15374416.2013.869749.

  20. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3:261.

  21. Brookman-Frazee L, Turner S, Gordon J, Myers R, Gist K, Dickson KS, Meza M. Evaluating implementation strategies in a community model to facilitate early identification and treatment of developmental and social-emotional problems in young children in out-of-home placement. Child Youth Serv Rev. 2018;88:504–13.

  22. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JAE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21. https://doi.org/10.1186/s13012-015-0209-1.

  23. Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, Vejnoska SF, Dufek S, Stahmer AC, Aarons GA. Community-academic partnerships: a systematic review of the state of the literature and recommendations for future research. Milbank Q. 2016;94(1):163–214. https://doi.org/10.1111/1468-0009.12184.

  24. Rycroft-Malone J, Wilkinson JE, Burton CR, Andrews G, Ariss S, Baker R, Dopson S, Graham I, Harvey G, Martin G, McCormack BG, Staniszewska S, Thompson C. Implementing health research through academic and clinical partnerships: a realistic evaluation of the Collaborations for Leadership in Applied Health Research and Care (CLAHRC). Implement Sci. 2011;6(1):74. https://doi.org/10.1186/1748-5908-6-74.

  25. Meade CD, Menard JM, Luque JS, Martinez-Tyson D, Gwede CK. Creating community-academic partnerships for cancer disparities research and health promotion. Health Promot Pract. 2011;12(3):456–62. https://doi.org/10.1177/1524839909341035.

  26. Bunger AC, Doogan N, Hanson RF, Birken SA. Advice-seeking during implementation: a network study of clinicians participating in a learning collaborative. Implement Sci. 2018;13(1):101. https://doi.org/10.1186/s13012-018-0797-7.

  27. DeSisto CL, Estrich C, Kroelinger CD, Goodman DA, Pliska E, Mackie CN, et al. Using a multi-state learning community as an implementation strategy for immediate postpartum long-acting reversible contraception. Implement Sci. 2017;12(1):138. https://doi.org/10.1186/s13012-017-0674-9.

  28. Bunger AC, Hanson RF, Doogan NJ, Powell BJ, Cao Y, Dunn J. Can learning collaboratives support implementation by rewiring professional networks? Adm Policy Ment Health. 2014;43:79–92.

  29. Hurlburt M, Aarons GA, Fettes D, Willging C, Gunderson L, Chaffin MJ. Interagency collaborative team model for capacity building to scale-up evidence-based practice. Child Youth Serv Rev. 2014;39:160–8. https://doi.org/10.1016/j.childyouth.2013.10.005.

  30. Rudd BN, Davis M, Beidas RS. Integrating implementation science in clinical research to maximize public health impact: a call for the reporting and alignment of implementation strategy use with implementation outcomes in clinical research. Implement Sci. 2020;15:1–11.

  31. Graham AK, Lattie EG, Powell B, Lyon AR, Smith JD, Schueller SM, et al. Improving capacity to implement digital mental health services in health care settings: strategies to address barriers and facilitators. Am Psychol Special Issue. (in press).

  32. Aarons GA, Ehrhart MG, Sklar M, Carandang K, Phillips J, Reeder K, et al. Dealing with chaos in organizational implementation research. Paper presented at the 2019 Addiction Health Services Research Conference. Park City, UT.

  33. Aarons GA, Ehrhart MG, Moullin JC, Torres EM, Green AE. Testing the leadership and organizational change for implementation (LOCI) intervention in substance abuse treatment: a cluster randomized trial study protocol. Implement Sci. 2017;12(1):29. https://doi.org/10.1186/s13012-017-0562-3.

  34. Graham AK, Lattie EG, Powell BJ, Lyon AR, Smith JD, Schueller SM, Stadnick NA, Brown CH, Mohr DC. Implementation strategies for digital mental health interventions in health care settings. Am Psychol. 2020;75:1080–92. https://doi.org/10.1037/amp0000686.

  35. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health. 2016;43(5):783–98. https://doi.org/10.1007/s10488-015-0693-2.

  36. Scott WR, Davis GF. Organizations and organizing: rational, natural, and open systems perspectives. Upper Saddle River: Pearson Education, Inc; 2007.

  37. Braithwaite J, Churruca K, Long JC, Ellis LA, Herkes J. When complexity science meets implementation science: a theoretical and empirical analysis of systems change. BMC Med. 2018;16(1):63. https://doi.org/10.1186/s12916-018-1057-z.

  38. Siegenfeld AF, Bar-Yam Y. An introduction to complex systems science and its applications. Complexity. 2020;2020(Article ID 6105872):1–16. https://doi.org/10.1155/2020/6105872.

  39. Birken SA, Bunger AC, Powell BJ, Turner K, Clary AS, Klaman SL, Yu Y, Whitaker DJ, Self SR, Rostad WL, Chatham JRS, Kirk MA, Shea CM, Haines E, Weiner BJ. Organizational theory for dissemination and implementation research. Implement Sci. 2017;12(1):62. https://doi.org/10.1186/s13012-017-0592-x.

  40. Williamson OE. The economics of organization: the transaction cost approach. Am J Sociol. 1981;87(3):548–77. https://doi.org/10.1086/227496.

  41. DiMaggio PJ, Powell WW. The iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am Sociol Rev. 1983;48(2):147–60. https://doi.org/10.2307/2095101.

  42. Donaldson L. The contingency theory of organizations, Neo-Contingency Theory. Thousand Oaks: Sage; 2001. p. 245–89.

  43. Pfeffer J, Salancik G. The external control of organizations: a resource dependence perspective. New York: Harper & Row; 1978.

  44. Lawrence TB, Hardy C. Building bridges for refugees: toward a typology of bridging organizations. J Appl Behav Sci. 1999;35(1):48–70. https://doi.org/10.1177/0021886399351006.

  45. Berkes F. Evolution of co-management: role of knowledge generation, bridging organizations and social learning. J Environ Manag. 2009;90(5):1692–702. https://doi.org/10.1016/j.jenvman.2008.12.001.

  46. Brown LD. Bridging organizations and sustainable development. Hum Relat. 1991;44(8):807–31. https://doi.org/10.1177/001872679104400804.

  47. Proctor E, Hooley C, Morse A, McCrary S, Kim H, Kohl P. Intermediary/purveyor organizations for evidence-based interventions in the US child mental health: characteristics and implementation strategies. Implement Sci. 2019;14:3.

  48. Long JC, Cunningham FC, Braithwaite J. Bridges, brokers and boundary spanners in collaborative networks: a systematic review. BMC Health Serv Res. 2013;13(1):158. https://doi.org/10.1186/1472-6963-13-158.

  49. Burt RS. Structural holes: the social structure of competition. Cambridge: Harvard University Press; 1992. https://doi.org/10.4159/9780674029095.

  50. Burt RS. Brokerage and closure: an introduction to social capital. New York: Oxford University Press; 2005.

  51. Aldrich H, Herker D. Boundary spanning roles and organization structure. Acad Manag Rev. 1977;2(2):217–30. https://doi.org/10.5465/amr.1977.4409044.

  52. Olabisi J, Lewis K. Within- and between-team coordination via transactive memory systems and boundary spanning. Group Organ Manag. 2018;43(5):691–717. https://doi.org/10.1177/1059601118793750.

  53. Marrone JA. Team boundary spanning: a multilevel review of past research and proposals for the future. J Manag. 2010;36:911–40.

  54. Cranley LA, Keefe JM, Taylor D, Thompson G, Beacom AM, Squires JE, Estabrooks CA, Dearing JW, Norton PG, Berta WB. Understanding professional advice networks in long-term care: an outside-inside view of best practice pathways for diffusion. Implement Sci. 2019;14(1):10. https://doi.org/10.1186/s13012-019-0858-6.

  55. Long JC, Cunningham FC, Wiley J, Carswell P, Braithwaite J. Leadership in complex networks: the importance of network position and strategic action in a translational cancer research network. Implement Sci. 2013;8(1):122. https://doi.org/10.1186/1748-5908-8-122.

  56. Stake RE. Multiple case study analysis. New York: The Guilford Press; 2006.

  57. Padgett DK. Qualitative methods in social work research. 3rd ed. Los Angeles: Sage Publications, Inc; 2017.

  58. Miles MB, Huberman AM, Saldaña J. Qualitative data analysis: a methods sourcebook. Thousand Oaks: Sage Publications Inc.; 2020.

  59. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36(1):307–23. https://doi.org/10.1146/annurev-publhealth-031912-114421.

  60. Hawe P, Shiell A, Riley T. Complex interventions: how out of control can a randomised controlled trial be? BMJ. 2004;328(7455):1561–3. https://doi.org/10.1136/bmj.328.7455.1561.

  61. Perez Jolles M, Lengnick-Hall R, Mittman B. Core functions and forms of complex health interventions: a patient-centered medical home illustration. J Gen Intern Med. 2019;34(6):1032–8. https://doi.org/10.1007/s11606-018-4818-7.

  62. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53. https://doi.org/10.1186/s13012-015-0242-0.

  63. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, Aarons GA, Weiner BJ, Chambers DA. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21. https://doi.org/10.1186/s13012-020-00983-3.

  64. McHugh S, Dorsey CN, Mettert K, Purtle J, Bruns E, Lewis CC. Measures of outer setting constructs for implementation research: a systematic review and analysis of psychometric quality. Implement Res Pract. 2020;1:1–20.

  65. Clinton-McHarg T, Yoong SL, Tzelepis F, Regan T, Fielding A, Skelton E, Kingsland M, Ooi JY, Wolfenden L. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the consolidated framework for implementation research: a systematic review. Implement Sci. 2016;11(1):148. https://doi.org/10.1186/s13012-016-0512-5.

  66. Kaplan HC, Brady PW, Dritz MC, Hooper DK, Linam WM, Froehle CM, et al. The influence of context on quality improvement success in health care: a systematic review of the literature. Milbank Q. 2010;88(4):500–59. https://doi.org/10.1111/j.1468-0009.2010.00611.x.

  67. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139. https://doi.org/10.1186/1748-5908-8-139.

  68. Suhrheinrich J, Schetter P, England A, Melgarejo M, Nahmias AS, Dean M, et al. Statewide interagency collaboration to support evidence-based practice scale up: the California autism professional training and information network (CAPTAIN). Evid Based Pract Child Adolesc Ment Health. 2020;5:468–82.

  69. Moullin J, Dickson KS, Stadnick N, Albers B, Nilsen P, Broder-Fingert S, et al. Ten recommendations for using implementation frameworks in research and practice. Implement Sci Commun. 2020;1(1):42. https://doi.org/10.1186/s43058-020-00023-7.

  70. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. https://doi.org/10.1007/s10488-010-0319-7.

  71. Brookman-Frazee L, Stadnick N, Roesch S, Regan J, Barnett M, Bando L, Innes-Gomberg D, Lau A. Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Adm Policy Ment Health. 2016;43(6):1009–22. https://doi.org/10.1007/s10488-016-0731-8.


Acknowledgements

The authors wish to thank Alicia Bunger, the CAPTAIN Network, Geoffrey Curran, Ann England, Michael Hurlburt, Lisa Saldana, Patricia Schetter, J.D. Smith, Mitchell Sarkies, Jessica Suhrheinrich, and Melissa Zielinski for their helpful insight and feedback on the case studies.

Funding

This work was supported by grants from the National Institute of Mental Health (T32MH019960; Lengnick-Hall; K23MH110602; PI: Stadnick; K23MH115100; PI: Dickson; R01DA049891, R01DA038466, R03MH117493; PI: Aarons) and Medical Research Future Fund (1168155 PI: Moullin). Additionally, Drs. Lengnick-Hall, Stadnick, and Dickson are fellows and Dr. Aarons is core faculty with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis; through an award from the National Institute of Mental Health (R25 MH080916–08). The funding bodies played no role in the design of the study and collection, analysis, and interpretation of data and in writing the manuscript.

Author information

Affiliations

Authors

Contributions

RLH, NAS, KDS, JCM, and GAA conceived of the study design, developed the structure of the manuscript, and participated in the analysis. RLH developed the case study template and led manuscript development. RLH drafted the manuscript and RLH, NAS, KDS, JCM, and GAA wrote and revised case studies and manuscript sections. All authors reviewed and revised several iterations of the manuscript and approved the final version.

Corresponding author

Correspondence to Gregory A. Aarons.

Ethics declarations

Ethics approval and consent to participate

Ethics approval and consent to participate was obtained for individual projects described in the case studies, when appropriate. Ethics approval and consent are not applicable for the current manuscript because we did not collect any human subject data.

Consent for publication

Not applicable.

Competing interests

GAA is an associate editor of Implementation Science. All decisions on this paper were made by another editor. The authors declare no other competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Lengnick-Hall, R., Stadnick, N.A., Dickson, K.S. et al. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implementation Sci 16, 34 (2021). https://doi.org/10.1186/s13012-021-01099-y

Keywords

  • Bridging factors
  • EPIS framework
  • Outer and inner context
  • Forms
  • Functions
  • Implementation
  • Sustainment
  • Barriers and enablers
  • Evidence-based practice