Understanding the implementation of evidence-informed policies and practices from a policy perspective: a critical interpretive synthesis
Implementation Science volume 16, Article number: 18 (2021)
The fields of implementation science and knowledge translation have evolved somewhat independently from the field of policy implementation research, despite calls for better integration. As a result, implementation theory and empirical work do not often reflect the implementation experience from a policy lens nor benefit from the scholarship in all three fields. This means policymakers, researchers, and practitioners may find it challenging to draw from theory that adequately reflects their implementation efforts.
We developed an integrated theoretical framework of the implementation process from a policy perspective by combining findings from these fields using the critical interpretive synthesis method. We began with the compass question: How is policy currently described in implementation theory and processes and what aspects of policy are important for implementation success? We then searched 12 databases as well as gray literature and supplemented these documents with other sources to fill conceptual gaps. Using a grounded and interpretive approach to analysis, we built the framework constructs, drawing largely from the theoretical literature and then tested and refined the framework using empirical literature.
A total of 11,434 documents were retrieved and assessed for eligibility and 35 additional documents were identified through other sources. Eighty-six unique documents were ultimately included in the analysis. Our findings indicate that policy is described as (1) the context, (2) a focusing lens, (3) the innovation itself, (4) a lever of influence, (5) an enabler/facilitator or barrier, or (6) an outcome. Policy actors were also identified as important participants or leaders of implementation. Our analysis led to the development of a two-part conceptual framework, including process and determinant components.
This framework begins to bridge the divide between disciplines and provides a new perspective about implementation processes at the systems level. It offers researchers, policymakers, and implementers a new way of thinking about implementation that better integrates policy considerations and can be used for planning or evaluating implementation efforts.
Implementation has captured the attention of public policy scholars for well over 50 years, yet it remains relatively under-studied compared to other stages of policy-making. The reasons for this are many and include challenges with isolating implementation from other parts of the policy process and a lack of agreement about conceptual underpinnings. This leads to difficulties in identifying relevant explanatory variables, and analysts often must resort to a “long list of variables that are potentially useful”. Even once decisions regarding these challenges have been made, the complex, multi-level, and multi-faceted nature of implementation makes it difficult to design and conduct high-quality empirical research that can offer useful generalizations to those interested in improving the process of implementation and thus achieving better policy results.
Research on implementation has also independently come into sharp focus through the related fields of knowledge translation and implementation science. Conceptual work on implementation from these fields has increased at a seemingly exponential rate to the point where there is a great deal of focus on sorting and classifying the many frameworks, models, and theories and providing guidance toward their use [3,4,5,6]. The empirical literature is also rapidly increasing, with over 6200 systematic reviews on consumer-targeted, provider-targeted, and organization-targeted implementation strategies in the health field alone (based on a search of www.healthsystemsevidence.org).
Despite the large number of models, theories, and frameworks being generated in the knowledge translation and implementation science fields, the role of policy in the implementation process appears to be under-theorized. When policy is included in conceptual work, it is often identified as a contextual variable [7, 8] rather than being central to the implementation concept itself. It is also often presented as a broad category of “policy”, rather than as a variable that is specific and therefore measurable in empirical work. This lack of conceptual clarity and empirical work about policy and other policy-related structural constructs has been noted by several researchers. For example, a systematic review of measures assessing the constructs affecting implementation of health innovations makes specific reference to the “relatively few” measures available to assess structural constructs, which it defines as “political norms, policies and relative resources/socio-economic status”. As a result, the field of public policy appears, on the one hand, to have too many policy-related implementation variables, while the fields of knowledge translation and implementation science, on the other, appear to have too few.
In recent years, some researchers have recognized these silos in scholarship and have called for more implementation research that integrates public policy, implementation science, and knowledge translation perspectives. For example, Johansson concludes that implementation problems could be better understood through the inclusion of research in public administration, with more focus on issues such as resource allocation, priorities, ethical considerations, and the distribution of power between actors and organizational boundaries.
In addition to these challenges, much of the seminal policy scholarship on implementation from both the public policy and knowledge translation and implementation literatures comes from the USA [12,13,14,15]. This has resulted in a concentration of theoretical and empirical works that reflect the governance, financial, and delivery arrangements that are particular to the USA [16, 17] and that may not always readily apply in other contexts. These differences are particularly marked when it comes to the policy domain of health, given the differences of the US system compared to most others. One notable exception to this is the European contributions to the “second generation” of policy scholarship on implementation, which adopted the perspective of those at the “coal face” of policy implementation.
In response to these challenges, the objective of our study was to develop an integrated theoretical framework of the implementation process from a policy perspective by combining findings from the public policy, implementation science, and knowledge translation fields. By integrating knowledge from these fields using a critical interpretive synthesis approach, we specifically examine how policy considerations are described in implementation theories, frameworks, and processes from existing published and gray literature. Our goal was to generate a theoretical framework to foster an improved understanding of the policy contributions to implementation that can be used in future studies to generate testable hypotheses about large-scale system implementation efforts.
Given the broad goal of this study, the question of interest, and the scope of potentially applicable literature from discrete fields that could inform this work, we selected a critical interpretive synthesis (CIS) approach. Drawing from the techniques of meta-ethnography combined with traditional systematic review processes, CIS employs an inductive and interpretive technique to critically inspect the literature and develop a new conceptualization of the phenomenon of interest. Unlike traditional systematic reviews that often focus on questions of effectiveness, CIS is helpful in generating mid-range theories with strong explanatory power [20, 21]. This suits our goal of developing a conceptual framework that better integrates findings from diverse fields, and it affords the opportunity to critically inspect both individual studies and the literature from each field as a whole, in terms of the assumptions underlying each field and what has influenced its proposed solutions. The method begins with a compass question, which evolves throughout the course of the review [22, 23]. Our compass question was as follows: How is policy currently described in implementation theory and processes and what aspects of policy are important for implementation success?
Our review casts a very broad net in terms of implementation processes and theories. While our main focus is on large-scale implementation efforts in health, behavioral health, and human services areas that are not specific to a particular condition, we also drew from other large-scale implementation theories and empirical work, such as from the field of environmental science, that may yield important insights toward a more integrated framework of implementation. We drew from two key sources of literature: (1) existing frameworks, models, and theories (public policy, implementation science and knowledge translation) and (2) empirical studies that report on specific implementation processes.
Given our interest in implementation processes from a policy perspective, we limited our review to implementation frameworks, models, theories, and empirical reports that describe implementation efforts at a community or systems level (e.g., city, province/state, or country), where policy considerations are most likely to be an important factor. Implementation of a single evidence-based practice (unless across a large scale) or implementation in a single organization was excluded, as was research that focused on behavior change at the individual level.
Electronic search strategy
Using the compass question, and in consultation with a librarian, we constructed a table of Boolean-linked key words and then tested several search strategies (Table 1). The search was then conducted in October 2020 for the time period of January 2000–September 2020 using the following 12 databases: ASSIA, CINAHL (via EBSCO), EMBASE (via Ovid), ERIC, HealthSTAR (via Ovid), MEDLINE (via Ovid), PAIS Index, PolSci, PsycINFO, Social Sciences Abstracts, Social Services Abstracts, and Web of Science. The dates for the policy databases (PolSci and Social Sciences Abstracts) were extended to 1973 to ensure key conceptual articles would be retrieved, such as the seminal work by Sabatier and Mazmanian in 1980. A gray literature search was also conducted using Health Systems Evidence (which indexes policy documents related to health system arrangements and implementation strategies, as well as systematic reviews). Similar search strings were used across all databases with minor adjustments to ensure searches were optimized. We prioritized sensitivity (comprehensiveness) over specificity (precision) in our search strategy.
We excluded articles based on their titles and abstracts if they did not fit within the study scope or if they were not conceptual or empirical works. We created additional inclusion criteria that were based on the following questions: (1) Is there a moderate (or greater) chance that the article will shed light on the role of policy in an implementation process or on the outcomes of the process? (2) Does the article describe implementation efforts at a community or systems level? And (3) does the article identify actors at the government, organizational or practice level such as policy entrepreneurs who may be central to policy implementation efforts? Any articles that did not meet at least one of these criteria were excluded.
Complementary to the formal search and in keeping with the inductive strategies that are part of the CIS process, we also conducted hand searches of the reference lists of relevant publications and searched the authors’ personal files to identify further articles and theoretically sampled additional articles to fill conceptual gaps as the analysis proceeded.
After completing the searches, an EndNote database was created to store and manage results. Once duplicates were removed, a random selection of two percent of the articles was independently screened by two reviewers (H.B. and A.M.) who were blinded to each other’s ratings and used the same inclusion criteria. The reviewers classified each title and abstract as “include”, “exclude”, or “uncertain”. Inter-rater agreement was determined using the kappa statistic. This process was undertaken to improve methodological rigor by enhancing trustworthiness and stimulating reflexivity, not to establish a quantitative assessment per se. Any discrepancies were then discussed between reviewers until consensus was reached. Next, one reviewer assessed the remaining titles and abstracts. Articles classified as “include” or “uncertain” were kept for full text review.
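For readers unfamiliar with the statistic, Cohen’s kappa corrects the observed agreement between two raters for the agreement expected by chance from each rater’s marginal label frequencies. The following minimal sketch is illustrative only (it is not the authors’ code, and the screening labels are hypothetical):

```python
# Illustrative sketch of Cohen's kappa for two reviewers' screening decisions.
# The function and the example labels below are hypothetical, not from the study.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical decisions for six abstracts
a = ["include", "exclude", "exclude", "uncertain", "include", "exclude"]
b = ["include", "exclude", "include", "uncertain", "include", "exclude"]
print(round(cohens_kappa(a, b), 2))  # → 0.74
```

A kappa near 0.7, as reported below, is conventionally interpreted as substantial agreement beyond chance.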
The full text of the remaining articles was then assessed by one reviewer. Articles were excluded at this stage if they did not provide detailed insight into the compass question. Articles were also sorted according to whether they were a conceptual contribution (i.e., presented a model, theory, framework or theoretical concept on implementation) or an empirical contribution (i.e., used qualitative, quantitative, review or other research methods to present new findings, or an analysis of implementation).
Data analysis and synthesis
Our data analysis proceeded in four stages. First, while screening and assessing the articles for inclusion, we noted some general observations of how policy was incorporated in the literature from each field of interest (policy/public administration, implementation, and knowledge translation). Second, we classified articles according to how policy was portrayed in implementation theory and processes. Third, we constructed a data extraction template for conceptual and empirical studies that included (1) descriptive categories (the author(s), the name of the model, theory or framework (if provided), year of publication, author location, focus of the article, and whether a graphic or visual aid was included), (2) content from the article that addressed the compass question regarding how policy is portrayed and what aspects are important for success, and (3) interpretive categories including “synthetic constructs” developed by the review team from the article and additional notes on how the article contributed to the development of the conceptual model. Additionally, the data extraction form for the conceptual articles included a classification of the type of framework according to Nilsen’s taxonomy of implementation models, theories, and frameworks.
In the fourth and final stage, we initially focused on the conceptual literature and used it as a base from which to build our integrated conceptual model. We developed the synthetic constructs by reviewing the content from each article that addressed the compass question and interpreting the underlying evidence using a constant comparative method to ensure that the emerging synthetic constructs were grounded in the data, similar to a grounded theory approach. These synthetic constructs were then used to begin to build the conceptual model and an accompanying graphic representation of it. We then critiqued the emerging constructs to identify gaps in the evidence and in the constructs themselves.
Using this emerging model, we purposively sampled additional conceptual literature to fill the gaps that we identified and to ensure we incorporated as many relevant concepts as possible. We did this by consulting reviews of existing models, theories, and frameworks [2,3,4,5,6] to identify additional relevant concepts not captured by our search strategy and by hand searching the reference sections of some seminal conceptual papers [7, 26]. Once saturation of the conceptual literature was reached, we purposively sampled a subset of the empirical literature and used this subset to “test” the model and add further detail to the theoretical constructs gleaned from empirical reports. We used a similar data extraction template, except that we removed the descriptive category of model or theory name and the interpretive classification using the Nilsen taxonomy, and added the descriptive category of “methodology”. If our model did not capture findings from the empirical studies, we revised it and re-tested. This process continued until saturation was reached and additional empirical studies yielded no further insights into our model.
The methods reported here are based on a protocol developed prior to initiating the study. The protocol and a note about the four ways that the reported methods differed from the protocol are available upon request.
Search results and article selection
Our database search retrieved 16,107 documents, of which 11,434 were unique once duplicates were removed. The review of titles and abstracts was completed independently by two reviewers on a random sample (n = 171) of the documents. The kappa score was 0.72, indicating substantial agreement. Figure 1 provides a flow diagram outlining the search strategy. Applying the inclusion criteria to the remaining titles and abstracts resulted in 1208 documents included for full-text review. The full-text review excluded an additional 940 documents, leaving 268 potentially relevant documents (excluded documents and the rationale for exclusion are available upon request). Of these, 23 conceptual documents, 243 empirical documents, and two documents that included both conceptual and empirical elements were included for the data extraction and analysis phase. We sampled and extracted data on all of the conceptual articles. For the empirical articles, we chose a maximum variation sampling approach based on the subject matter and article topic, with an initial sample of 10% of the articles. We also noted that nine of the articles related to a large, multi-year national implementation study [27,28,29,30,31,32,33,34,35]. Because this was the largest and most comprehensive account of the role of policy in large-scale implementation efforts identified through our search, we included these as a sub-group for data extraction. This approach led to data extraction for 34 empirical articles.
In addition to these two approaches, we sampled articles that filled conceptual gaps as our model developed. This process resulted in the retrieval of an additional 26 conceptual articles and 3 empirical articles. In total, 86 unique documents were included, with two of these documents used in both the conceptual and empirical data extraction (Tables 2 and 3). While our process was inclusive of English-language publications from any country, the majority of articles were conducted by US researchers (n = 57), with the others coming mainly from other Western countries (the UK (n = 8), the Netherlands (n = 7), Australia (n = 5), Canada (n = 2), Sweden (n = 2), Germany (n = 2), and one each from Europe, China, and the OECD). Articles covered a range of topics including health and health care, public health, mental health and addictions, children and youth, social care, justice, and climate change, among others. The conceptual documents included all of the categories of theories, models, and frameworks identified by Nilsen, with the determinants framework type being most common. The empirical articles employed a wide array of methods that fall into the broad categories of qualitative, quantitative, and mixed methods.
Through this process, we noted several general observations regarding the characteristics of existing literature. In terms of the scholarly disciplines, most of the implementation science literature focused on the organizational or service provider levels with an emphasis on changing practice, often by introducing an evidence-informed policy or practice (EIPP). The knowledge translation literature included policymakers as a target audience for research evidence, but the focus was on the agenda setting or policy formulation stages of the policy cycle, as opposed to the implementation of an EIPP. Here, the scholarship focused on strategies to increase the use of evidence in policy decision-making. The public policy literature included theory describing “top-down”, “bottom-up”, and integrated approaches to implementing an EIPP. The object of implementation in this area was the policy itself, rather than a specific program or practice. There was often no clear articulation of independent and dependent policy-related implementation variables across any field, although many articles did partially address this.
How policy is described in implementation theory and processes
Our coding based on the compass question resulted in the following characterization of how policy is described in implementation theory and processes:
Policy is described as follows:
Context in which implementation occurs (i.e., only briefly citing a policy as the reason for implementation)
Focusing lens, signaling to systems what the priorities should be (i.e., referring to policy statements or attention by policymakers as a signal about what is important to prioritize)
Innovation itself—the implementation object (i.e., the “thing” being implemented is policy, such as new legislative policy on tobacco cessation)
Lever of influence in the implementation process (i.e., policy is identified as at least one of the factors influencing the implementation process)
Enabler/facilitator or barrier to implementation (moderating variable) (i.e., while policy is identified as being external to the implementation effort, it is later found to be a barrier or facilitator to implementation)
Outcome—the success of the implementation process is at least partially defined and measured by a change in policy.
Policy actors as important participants or leaders in implementation
Our approach to developing the theoretical framework was twofold. The findings from our analysis suggested constructs that addressed both the process of implementation and the factors underpinning the success or failure of implementation. We therefore first developed a process model that describes the steps in the process of translating EIPPs into effectively embedded system changes. Next, we developed a determinants framework, which specifies the types of policy determinants (independent variables) that affect implementation outcomes (dependent variables). This two-part theoretical framework achieves two goals: (1) the process model is most useful in describing the process of implementation from a policy perspective and (2) the determinants framework is most useful for understanding and explaining policy-related influences on implementation outcomes.
Part 1—process model
Figure 2 depicts this novel process model focusing on one policy or system level. What follows is a narrative description of the model.
Policy is shaped as it moves through systems. The process through which policy travels from one level to another is known as policy transfer [36, 45, 46]. Each policy level is nested in a context that includes existing ideas (values, evidence, etc.), interests (interest groups, civil society, etc.), institutions (existing rules and institutional structures), and external factors (e.g., natural disasters or changes in economic conditions) that affect the interpretation of the policy package [107, 108]. This context affects how a problem is defined, whether it has the attention of decision makers, and whether it is up for active decision-making. This aligns with the “problem definition” and “agenda setting” stages of the policy cycle but is also described as part of the “exploration phase” in implementation science [12, 109]. Once a decision has been reached that something should be done to address a given issue, attention shifts to the “policy development” stage of the policy cycle, which aligns with the “adoption decision and preparation” stage of implementation. It is during this combined stage that the policy package gets developed.
A policy package usually includes a mix of policy levers or instruments, including legal and regulatory instruments, economic instruments, voluntary instruments, or information and education instruments [58, 110]. The policy package can also include some implementation guidance such as a description of the overall implementation strategy architecture, the major streams of activity, timing of events and milestones, and roles and responsibilities.
The level of ambiguity of the policy package in terms of its goals and the means of attaining them, and the amount of conflict among actors with respect to the policy package, are important for characterizing the implementation process and explaining its outcomes. According to Matland, the consideration of ambiguity and conflict leads to four types of implementation processes: (1) administrative implementation occurs when there is low policy ambiguity and low policy conflict (e.g., eradication of smallpox), (2) political implementation occurs when there is low ambiguity but high levels of conflict (e.g., public transit), (3) experimental implementation occurs when there is high ambiguity but low conflict (e.g., Head Start programs for young children), and (4) symbolic implementation occurs when both ambiguity and conflict are high, so that policies have only a referential goal and actors hold differing perspectives on how to translate that abstract goal into instrumental actions (e.g., establishing youth employment agencies).
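Matland’s typology is, in effect, a two-by-two lookup on ambiguity and conflict. The sketch below is our own illustrative rendering of that lookup (the function name and `"low"`/`"high"` encoding are assumptions for illustration, not from Matland or the paper):

```python
# Illustrative rendering of Matland's ambiguity-conflict typology as a lookup.
# Function name and the "low"/"high" string encoding are our own choices.
def matland_type(ambiguity: str, conflict: str) -> str:
    """Return the implementation-process type for given ambiguity and conflict levels."""
    types = {
        ("low", "low"): "administrative",    # e.g., smallpox eradication
        ("low", "high"): "political",        # e.g., public transit
        ("high", "low"): "experimental",     # e.g., Head Start programs
        ("high", "high"): "symbolic",        # e.g., youth employment agencies
    }
    return types[(ambiguity, conflict)]

print(matland_type("high", "low"))  # → experimental
```

Framing the typology this way highlights that the two dimensions are treated as independent, with each quadrant implying a distinct implementation dynamic.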
The policy implementation process can start at any level, move in any direction, and can “skip” levels. Power also shifts as implementation proceeds through levels [29, 56]. The level with the most implementation activity tends to have the most power. This is true not only for different levels of governance, but as implementation cascades across organizations, through “street-level bureaucrats”, and on to the end-user or target population (the “recipient”) of the implementation process. Policy decisions at one level become context for other levels. Implementation activities at one level can exert either direct or indirect effects on another level. The context surrounding each level (prevailing ideas, interests, institutions, and external events) influences the acceptability and ultimate success of implementation. Finally, the overall implementation approach may need to shift over time in response to a constantly evolving context. For example, one study found it necessary to change the implementation approach for a road safety program in response to changes in policy authority.
The process of implementation is undertaken in order to lead to outcomes, which can be separated and measured at different levels. Proctor et al. identify three separate outcomes: (1) implementation outcomes, (2) service outcomes, and (3) recipient-related outcomes. Along with these outcomes, our model includes policy- and systems-level outcomes. These can be evaluated according to policy outputs (e.g., enforcement variables, a change of perspective among street-level staff), policy outcomes (e.g., unemployment levels, life expectancy of the population), or indices of policy system change (e.g., administrative re-organization, privatization). While the measures and levels will vary depending on the size, scale, and focus of implementation, there is broad agreement that outcomes should be clearly defined a priori and precisely measured. Evaluation findings regarding outputs and outcomes can dynamically feed back into the implementation process as it unfolds, creating feedback loops that make the process dynamic and multi-directional.
Part 2—determinants framework
Figure 3 presents an overview of our determinants framework and the relationship among the determinants. Our findings point to three sets of policy-related factors that affect the process, outputs, and outcomes of implementation: (1) policy instruments and strategies, (2) determinants of implementation, and (3) policy actors, including their characteristics, relationships, and context. Collectively, these feed into the process of implementation that proceeds in an iterative fashion along the stages: exploration, installation/preparation, initial implementation, and full implementation/sustainment [12, 109]. The types of policy influences vary according to the stage of implementation. The process of implementation leads to a variety of outputs and outcomes as described above.
Policy instruments and strategies
Policy instruments and strategies are the most common set of factors mentioned in the literature, and we found evidence for each of the instrument types described here, although with varying levels of detail. Policy instruments can be applied to implementation in differing ways, often with two or three levers used concurrently to implement a single initiative or strategy. In order to classify these strategies in a meaningful way, we drew on and adapted elements of a mutually exclusive and collectively exhaustive framework that identifies key features of health and social systems and homed in on strategies that are particularly important for implementation (Table 4). These include strategies focused on the governance arrangements, financial arrangements, service delivery arrangements, and implementation-related supports in systems. We then divided these strategies according to the intended “target” of implementation. Common targets of implementation from a policy perspective include the whole system, organizations, the workforce or service providers, consumers, and the innovation itself (the EIPP to be implemented). We wish to note, however, that because policy-related variables have not necessarily been treated with the same specificity as other types of implementation variables, the most common strategies do not reflect the full array of strategies that could be employed.
Our framework identifies eight categories of determinants (see “Determinants” box and elsewhere in Fig. 3). Each of these categories represents a suite of factors that are hypothesized to independently affect implementation outcomes. These determinants are described briefly below and in more detail in Table 5.
I—Characteristics of the evidence-informed policy or practice (EIPP). The success or failure of a particular policy package cannot be evaluated based on its intrinsic characteristics alone. Instead, it is important to examine whether the policy selected is an appropriate “fit” with the problem, well-justified, and aligned with existing context [12, 88].
II—Policy formulation process. This is the shape given to a policy by the initial formation processes. It includes who in government is responsible for formulating the policy, their legitimacy and the extent to which there is opportunity to provide feedback, how much feedback is given, and the responsiveness in terms of adjustments made.
III—Vertical public administration and thickness of hierarchy. Vertical public administration is the term used to identify the layers in the policy transfer process. It refers to separate governments exercising their authority with relative autonomy. Policies generated outside of a socio-political level may be more or less acceptable to that level. Within a given layer, a particular policy area may require the mobilization of any number of institutions, departments, or agencies, and these agencies must act in a coordinated, interdependent fashion, termed “thickness of the hierarchy”.
IV—Networks/inter-organizational relationships. The existence and nature of the relationships between parallel organizations that must collaborate in order to achieve effective implementation and that do not have a hierarchical relationship.
V—Implementing agency responses. The factors affecting the responses of implementing agencies can be divided into issues related to the overall characteristics of the agencies and the behavior of front-line or street-level staff [13, 56].
VI—Attributes and responses of those affected by the EIPP. Attributes include the diversity of target group behavior and the target group as a percentage of the population. Responses include things like impacts on workforce stability.
VII—Timing/sequencing. As implementation is a process that unfolds over time, it does not always align with the cycles to which it is subject and the time constraints inherent therein [86, 87]. Additionally, the external context in which implementation occurs is ever changing and “quintessentially unstable”, and success hinges on the ability to perceive those changes and take the necessary actions to adjust along the way . In Fig. 3, timing/sequencing is placed outside of the determinants box to reflect its importance across all of the other elements.
VIII—External environment or policy context. Much of the literature identified factors outside the policy area of focus that may influence implementation (Fig. 3, outside the hatched line). Many authors referred to this generally as the “political and social climate”, as unmodifiable or macro “context”, or as “socio-economic conditions” [9, 14, 38, 40, 52, 70, 75, 80]. We organized this determinant using (1) the 3I+E framework and (2) a taxonomy of health and social system arrangements.
Our analysis revealed a wide range of policy actors who are important for implementation. To create a category of variables that is analytically useful across contexts, we first divided the types of policy actors into the broad categories of political actors, bureaucratic actors, special interests, and experts. To provide more specificity, we further divided these into a non-exhaustive list of actor sub-types that were frequently mentioned in the literature and included examples of the roles they tend to assume in implementation (Table 6). While many of the sub-types are commonly identified in other phases of the policy cycle, some receive particular attention in the implementation literature. These include two types of special interests: (1) implementing agencies, the organizations or programs that are responsible for implementing the EIPP (e.g., hospitals, schools), and (2) street-level bureaucrats, whose relatively high degree of discretion in their jobs, and therefore over the dispensation of public benefits or sanctions to citizens, can make them critical to realizing any large-scale implementation effort. There are also three expert sub-types that are particularly visible during implementation: (1) field or practice leaders, who can be influential in supporting practice change among professionals; (2) innovation developers/disseminators, who have developed the EIPP to be implemented and who may contribute or adapt tools and other types of support to encourage successful implementation; and (3) intermediaries/technical assistance providers, the organizations, programs, or individuals that work “in between” policymakers, funders, and front-line implementers, drawing on implementation expertise to facilitate effective implementation.
There are also three categories of actor-related variables that are important: (1) actor characteristics, (2) actor relationships, and (3) the context in which the actors are embedded (Fig. 4). First, the characteristics of the policy actors (at either the individual or organizational level), such as their knowledge of the implementation context, their legitimacy, power, and control, and their leadership in the context of the implementation effort, are often cited as critical to success in large-scale implementation initiatives. Second, the relationships policy actors have with other actors, such as the level of shared values and beliefs or the coordination and alignment of actors and their activities, can be predictive of successful implementation. Finally, the context in which the actors are embedded, such as the sustainment of political will and commitment and the stability of the actors themselves, can predict the long-term success of implementation.
Our study represents one of the first comprehensive attempts to answer the call of scholars to integrate the fields of implementation science, knowledge translation, and policy implementation in an effort to build a more comprehensive and accurate understanding of implementation. By integrating conceptual and empirical works from all three fields, the resultant two-part theoretical framework provides additional clarity regarding the process of implementation viewed from a policy perspective and identifies a number of policy-related determinants that can be tested empirically in the future.
A key strength of our study was the methodological approach we took to theory building.
First was the comprehensiveness of the search strategy, which aimed to identify scholarship from more than one academic discipline and across a wide range of topics beyond health. The literature identified through the search process revealed some interesting parallels and differences between the fields that made clear to us how little integration has occurred to this point. Perhaps not surprisingly, public health seemed to be the most fertile ground for integration, likely because of its focus on population-level concerns requiring system-wide implementation of EIPPs and a diverse implementation ecosystem. The search strategy was part of the mixed methods approach of the CIS, which blended the rigor of a systematic search methodology that is explicit and replicable with the inductive, iterative, and purposive sampling techniques of qualitative review methods to build mid-range theory. The result is a theoretical framework that is clearly linked to the literature, which should instill some confidence in the academic community regarding its grounding. Critical interpretive synthesis is a relatively new approach but is growing in popularity for these reasons.
Despite the merits of our approach, we did identify some challenges. First, we believe the literature from public policy may be underrepresented for several reasons: (1) the search terms did not retrieve as much from that field (terms used more commonly in the field might have increased the yield), (2) the disciplinary approach to scholarship in public policy often means that articles are less explicit about methods, so more were excluded as not being “high yield”, and (3) more of that scholarship is captured through other media (e.g., books), and while some of these were included, our approach was not as sensitive to retrieving these types of documents. We also did not include all of the empirical articles for data extraction, so we may have missed a key theme or framework component. While we believe this is unlikely because we continued to sample until saturation was reached, it is still possible something was missed. Finally, few documents from low- and middle-income countries (LMICs) were included in the final sample. Specific efforts to include relevant documents from LMICs in the future may enrich and refine the model.
As a result of this research, policymakers and practitioners looking to use a conceptual model to guide their implementation activities have two additional options that they can be confident are drawn from existing theory and empirical work. Those leading large-scale implementation endeavors, or efforts that have started small and are looking to scale up, should at least be mindful of the critical roles of policy during the process and of the policy-related factors that may be important for success. Those planning implementation activities can treat the elements presented in the framework as factors that may require consideration and adjustment before implementing something new. Our work supports thinking beyond the program or practice levels and unpacks policy considerations that may influence, or affect the effectiveness of, a program or practice. Furthermore, the inclusion of policy-related outputs and outcomes in our framework offers policymakers and practitioners additional indicators of success that they can measure and report on.
Like any new theoretical contribution, our framework would benefit from further refinement and testing by the research community. Future research could adopt the process model to guide a policy-intensive implementation effort and test its usefulness in practice. Researchers could also select particular framework elements and unpack them further for additional precision and clarity, drawing from multiple fields of scholarship. Our framework also offers policy variables that have been lacking in the implementation science and knowledge translation fields and that could be incorporated as part of a suite of variables in implementation research.
Our study represents an early effort at integrating the fields of public policy, implementation science, and knowledge translation. We have learned that there is indeed a great deal that each of the fields can learn from the others to advance our understanding of policy- and systems-level implementation efforts, and we hope that these efforts are followed by more interdisciplinary research in order to truly bridge this divide.
Pressman JL, Wildavsky AB. Implementation. How great expectations in Washington are dashed in Oakland. Berkeley, CA: University of California Press; 1973.
Hill M, Hupe P. Implementing public policy: governance in theory and in practice. 3rd ed. London, UK: SAGE; 2014.
Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53.
Moullin JC, Sabater-Hernandez D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16.
Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–50.
Mitchell SA, Fisher CA, Hastings CE, Silverman LB, Wallen GR. A thematic analysis of theoretical models for translational science in nursing: mapping the field. Nurs Outlook. 2010;58(6):287–300.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
Wandersman A, Chien VH, Katz J. Toward an evidence-based system for innovation support for implementing innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement. Am J Community Psychol. 2012;50(3–4):445–59.
Chaudoir SR, Dugan AG, Barr CHI. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.
Nilsen P, Stahl C, Roback K, Cairney P. Never the twain shall meet?--a comparison of implementation science and policy implementation research. Implement Sci. 2013;8:63.
Johansson S. Implementing evidence-based practices and programmes in the human services: lessons from research in public administration. Eur J Soc Work. 2010;13(1):109–25.
Aarons GA, Hurlburt M, Horwitz SMC. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.
Lipsky M. Street-level bureaucracy: dilemmas of the individual in public services. New York: Russell Sage Foundation; 1980.
Sabatier P, Mazmanian D. The implementation of public policy: a framework of analysis. Policy Stud J. 1980;8(4):538–60.
Van Meter DS, Van Horn CE. The policy implementation process: a conceptual framework. Adm Soc. 1975;6(4):445–88.
O’Toole LJ Jr. Research on policy implementation: assessment and prospects. J Public Adm Res Theor. 2000;10(2):263–88.
Saetren H. Implementing the third generation research paradigm in policy implementation research: an empirical assessment. Public Policy Adm. 2014;29(2):84–105.
Papanicolas I, Woskie LR, Jha AK. Health care spending in the United States and other high-income countries. JAMA. 2018;319(10):1024–39.
Hjern B. Implementation research—the link gone missing. J Public Policy. 1982;2(3):301–8.
Flemming K. Synthesis of quantitative and qualitative research: an example using Critical Interpretive Synthesis. J Adv Nurs. 2010;66(1):201–17.
Wilson MG, Ellen ME, Lavis JN, Grimshaw JM, Moat KA, Shemer J, et al. Processes, contexts, and rationale for disinvestment: a protocol for a critical interpretive synthesis. Syst Rev. 2014;3(1):143.
Dixon-Woods M, Cavers D, Agarwal S, Annandale E, Arthur A, Harvey J, et al. Conducting a critical interpretive synthesis of the literature on access to healthcare by vulnerable groups. BMC Med Res Methodol. 2006;6(1):35.
Entwistle V, Firnigl D, Ryan M, Francis J, Kinghorn P. Which experiences of health care delivery matter to service users and why? A critical interpretive synthesis and conceptual map. J Health Serv Res Policy. 2012;17(2):70–8.
Patton MQ. Enhancing the quality and credibility of qualitative analysis. Health Serv Res. 1999;34(5 Pt 2):1189.
Charmaz K. Constructing grounded theory: a practical guide through qualitative analysis. 2nd ed. London: SAGE; 2014.
Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. The Milbank Quarterly. 2004;82(4):581–629.
Bond GR, Drake RE, McHugo GJ, Rapp CA, Whitley R. Strategies for improving fidelity in the National Evidence-Based Practices Project. Res Soc Work Pract. 2009;19(5):569–81.
Finnerty MT, Rapp CA, Bond GR, Lynde DW, Ganju V, Goldman HH. The State Health Authority Yardstick (SHAY). Community Ment Health J. 2009;45(3):228–36.
Isett KR, Burnam MA, Coleman-Beattie B, Hyde PS, Morrissey JP, Magnabosco J, et al. The state policy context of implementation issues for evidence-based practices in mental health. Psychiatr Serv. 2007;58(7):914–21.
Isett KR, Burnam MA, Coleman-Beattie B, Hyde PS, Morrissey JP, Magnabosco JL, et al. The role of state mental health authorities in managing change for the implementation of evidence-based practices. Community Ment Heal J. 2008;44(3):195–211.
Jones AM, Bond GR, Peterson AE, Drake RE, McHugo GJ, Williams JR. Role of state mental health leaders in supporting evidence-based practices over time. J Behav Health Serv Res. 2014;41(3):347–55.
Mancini AD, Moser LL, Whitley R, McHugo GJ, Bond GR, Finnerty MT, et al. Assertive community treatment: facilitators and barriers to implementation in routine mental health settings. Psychiatr Serv. 2009;60(2):189–95.
Peterson AE, Bond GR, Drake RE, McHugo GJ, Jones AM, Williams JR. Predicting the long-term sustainability of evidence-based practices in mental health care: an 8-year longitudinal analysis. J Behav Health Serv Res. 2014;41(3):337–46.
Rapp CA, Bond GR, Becker DR, Carpinello SE, Nikkel RE, Gintoli G. The role of state mental health authorities in promoting improved client outcomes through evidence-based practice. Community Ment Health J. 2005;41(3):347–63.
Rapp CA, Goscha RJ, Carlson LS. Evidence-based practice implementation in Kansas. Community Ment Health J. 2010;46(5):461–5.
Bauman AE, Nelson DE, Pratt M, Matsudo V, Schoeppe S. Dissemination of physical activity evidence, programs, policies, and surveillance in the international public health arena. Am J Prev Med. 2006;31(4 Suppl):S57–65.
Bowen SAK, Saunders RP, Richter DL, Hussey J, Elder K, Lindley L. Assessing levels of adaptation during implementation of evidence-based interventions: introducing the Rogers-Rutten framework. Health Educ Behav. 2010;37(6):815–30.
Bowen S, Zwi AB. Pathways to “evidence-informed” policy and practice: a framework for action. PLoS Med. 2005;2(7):600–5.
Bruns EJ, Hoagwood KE, Rivard JC, Wotring J, Marsenich L, Carter B. State implementation of evidence-based practice for youths, part II: Recommendations for research and policy. J Am Acad Child Adolesc Psychiatry. 2008;47(5):499–504.
Burris S, Mays GP, Scutchfield FD, Ibrahim JK. Moving from intersection to integration: public health law research and public health systems and services research. Milbank Q. 2012;90:375–408.
Campos PA, Reich MR. Political analysis for health policy implementation. Health Syst Reform. 2019;5(3):224–35.
Cherney A, Head B. Supporting the knowledge-to-action process: a systems-thinking approach. Evid Policy. 2011;7(4):471–88.
Chin MH, Goldmann D. Meaningful disparities reduction through research and translation programs. JAMA. 2011;305(4):404–5.
Domitrovich CE, Bradshaw CP, Poduska JM, Hoagwood K, Buckley JA, Olin S, Romanelli LH, Leaf PJ, Greenberg MT, Ialongo NS. Maximizing the implementation quality of evidence-based preventive interventions in schools: A conceptual framework. Adv Sch Ment Health Promot. 2008;1(3):6–28.
Evans M, Davies J. Understanding policy transfer: a multi-level, multi-disciplinary perspective. Public Adm. 1999;77(2):361–85.
Dolowitz DP, Marsh D. Learning from abroad: the role of policy transfer in contemporary policy-making. Governance. 2000;13(1):5–23.
Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Jt Comm J Qual Patient Saf. 2008;34(4):228–43.
Fleuren M, Wiefferink K, Paulussen T. Determinants of innovation within health care organizations: literature review and Delphi study. Int J Qual Health Care. 2004;16(2):107–23.
Godfrey JL. Re-implementing assertive community treatment: one agency’s challenge of meeting state standards. Diss Abstr Int Sect B Sci Eng. 2011;72(4-B):2434.
Green LW, Orleans CT, Ottoson JM, Cameron R, Pierce JP, Bettinghaus EP. Inferring strategies for disseminating physical activity policies, programs, and practices from the successes of tobacco control. Am J Prev Med. 2006;31(4 Suppl):S66–81.
Greig G, Entwistle VA, Beech N. Addressing complex healthcare problems in diverse settings: insights from activity theory. Soc Sci Med. 2012;74(3):305–12.
Harris JR, Cheadle A, Hannon PA, Forehand M, Lichiello P, Mahoney E, Snyder S, Yarrow J. A framework for disseminating evidence-based health promotion practices. Prev Chronic Dis. 2012;9:E22.
Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11(1):33.
Hendriks A, Jansen M, Gubbels J, De Vries N, Paulussen T, Kremers S. Proposing a conceptual framework for integrated local public health policy, applied to childhood obesity--the behavior change ball. Implement Sci. 2013;8(1):46.
Hill M, Hupe P. The multi-layer problem in implementation research. Public Manag Rev. 2003;5(4):471–90.
Hill M, Hupe P. Implementing public policy: governance in theory and in practice. 1st ed. London, UK: Sage; 2002.
Hodges S, Ferreira K. A multilevel framework for local policy development and implementation. In: Child and family advocacy: bridging the gaps between research, practice, and policy. New York, NY: Springer Science; 2013. p. 205–15.
Howlett M. Beyond good and evil in policy implementation: Instrument mixes, implementation styles, and second generation theories of policy instrument choice. Policy Soc. 2004;23(2):1–7.
Hupe PL. The thesis of incongruent implementation: revisiting Pressman and Wildavsky. Public Policy Adm. 2011;26(1):63–80.
Hupe PL, Hill MJ. ‘And the rest is implementation.’ Comparing approaches to what happens in policy processes beyond Great Expectations. Public Policy Adm. 2016;31(2):103–21.
Jansen MW, van Oers HA, Kok G, de Vries NK. Public health: disconnections between policy, practice and research. Health Res Policy Syst. 2010;8(1):37.
Jilcott S, Ammerman A, Sommers J, Glasgow RE. Applying the RE-AIM framework to assess the public health impact of policy change. Ann Behav Med. 2007;34(2):105–14.
Leeman J, Sommers J, Vu M, Jernigan J, Payne G, Thompson D, et al. An evaluation framework for obesity prevention policy interventions. Prev Chronic Dis. 2012;9(6):E120.
Matland RE. Synthesizing the implementation literature: the ambiguity-conflict model of policy implementation. J Public Adm Res Theory. 1995;5(2):145–74.
Mendel P, Meredith LS, Schoenbaum M, Sherbourne CD, Wells KB. Interventions in organizational and community context: a framework for building evidence on dissemination and implementation in health services research. Adm Policy Ment Health. 2008;35(1–2):21–37.
Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6(1):42.
Moulton S, Sandfort JR. The strategic action field framework for policy implementation research. Policy Stud J. 2017;45(1):144–69.
Pettigrew A, Whipp R. Managing change and corporate performance. In: European industrial restructuring in the 1990s. London: Palgrave Macmillan; 1992. p. 227–65.
Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Heal. 2011;38(2):65–76.
Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3(1):26.
Rütten A, Lüschen G, von Lengerke T, Abel T, Kannas L, Diaz JAR, et al. Determinants of health policy impact: a theoretical framework for policy analysis. Soz Präventivmed Soc Prev Med. 2003;48(5):293–300.
Schoenwald SK, Chapman JE, Kelleher K, Hoagwood KE, Landsverk J, Stevens J, et al. A survey of the infrastructure for children’s mental health services: implications for the implementation of empirically supported treatments (ESTs). Adm Policy Ment Health. 2008;35(1–2):84–97.
Shortell SM. Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Med Care Res Rev. 2004;61(3 Suppl):12S–30S.
Spoth R, Rohrbach LA, Greenberg M, Leaf P, Brown CH, Fagan A, Catalano RF, et al. Addressing core challenges for the next generation of type 2 translation research and systems: the Translation Science to Population Impact (TSci Impact) framework. Prev Sci. 2013;14(4):319–51.
Strehlenert H, Richter-Sundberg L, Nyström ME, Hasson H. Evidence-informed policy formulation and implementation: a comparative case study of two national policies for improving health and social care in Sweden. Implement Sci. 2015;10(1):169.
Thomann E, Hupe P, Sager F. Serving many masters: public accountability in private policy implementation. Governance. 2018;31(2):299–319.
Lukas CV, Holmes SK, Cohen AB, Restuccia J, Cramer IE, Shwartz M, et al. Transformational change in health care systems: an organizational model. Health Care Manage Rev. 2007;32(4):309–20.
Viennet R, Pont B. Education policy implementation. OECD Education Working Papers Series. 2017.
Wandersman A, Alia K, Cook BS, Hsu LL, Ramaswamy R. Evidence-based interventions are necessary but not sufficient for achieving outcomes in each setting in a complex world: empowerment evaluation, getting to outcomes, and demonstrating accountability. Am J Eval. 2016;37(4):544–61.
Wisdom JP, Chor KHB, Hoagwood KE, Horwitz SM. Innovation adoption: a review of theories and constructs. Adm Policy Ment Heal. 2014;41(4):480–502.
Bax C, de Jong M, Koppenjan J. Implementing evidence-based policy in a network setting: road safety policy in the Netherlands. Public Adm. 2010;88(3):871–84.
Beidas RS, Stewart RE, Adams DR, Fernandez T, Lustbader S, Powell BJ, et al. A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Adm Policy Ment Health. 2016;43(6):893–908.
Brodowski ML, Counts JM, Gillam RJ, Baker L, Collins VS, Winkle E, et al. Translating evidence-based policy to practice: a multilevel partnership using the interactive systems framework. Fam Soc J Contemp Soc Serv. 2013;94(3):141–9.
Brownson RC, Allen P, Jacob RR, Harris JK, Duggan K, Hipp PR, et al. Understanding mis-implementation in public health practice. Am J Prev Med. 2015;48(5):543–51.
Cheadle A, LoGerfo JP, Schwartz S, Harris JR. Promoting sustainable community change in support of older adult physical activity: evaluation findings from the Southeast Seattle Senior Physical Activity Network (SESPAN). J Urban Health. 2009:1–9.
Culotta D, Wiek A, Forrest N. Selecting and coordinating local and regional climate change interventions. Environ Plan C-Government Policy. 2016;34(7):1241–66.
Evans BA, Snooks H, Howson H, Davies M. How hard can it be to include research evidence and evaluation in local health policy implementation? Results from a mixed methods study. Implement Sci. 2013;8.
Fleuren MAH, Paulussen TGWM, Dommelen P, Van Buuren S. Towards a measurement instrument for determinants of innovations. Int J Qual Health Care. 2014;26(5):501–10.
Gotham HJ, White MK, Bergethon HS, Feeney T, Cho DW, Keehn B. An implementation story: moving the GAIN from pilot project to statewide use. J Psychoactive Drugs. 2008;40(1):97–107.
Grace FC, Meurk CS, Head BW, Hall WD, Carstensen G, Harris MG, et al. An analysis of policy levers used to implement mental health reform in Australia 1992-2012. BMC Health Serv Res. 2015;15(1):479.
Grundy J, Smith M. Evidence and equity: Struggles over federal employment equity policy in Canada, 1984-95. Can Public Adm. 2011;54(3):335–57.
Hargreaves M, Cole R, Coffee-Borden B, Paulsell D, Boller K. Evaluating infrastructure development in complex home visiting systems. Am J Eval. 2013;34(2):147–69.
Haug C, Rayner T, Jordan A, Hildingsson R, Stripple J, Monni S, et al. Navigating the dilemmas of climate policy in Europe: evidence from policy evaluation studies. Clim Change. 2010;101(3–4):427–45.
Horner RH, Kincaid D, Sugai G, Lewis T, Eber L, Barrett S, et al. Scaling up school-wide positive behavioral interventions and supports: experiences of seven states with documented success. J Posit Behav Interv. 2014;16(4):197–208.
Monroe-DeVita M, Morse G, Bond GR. Program fidelity and beyond: multiple strategies and criteria for ensuring quality of assertive community treatment. Psychiatr Serv. 2012.
Painter K. Legislation of evidence-based treatments in public mental health: analysis of benefits and costs. Soc Work Public Health. 2009;24(6):511–26.
Perla RJ, Bradbury E, Gunther-Murphy C. Large-scale improvement initiatives in healthcare: a scan of the literature. J Healthc Qual. 2013;35(1):30–40.
Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69(2):123–57.
Powell BJ, Proctor EK, Glass JE. A systematic review of strategies for implementing empirically supported mental health interventions. Res Soc Work Pr. 2014;24(2):192–212.
Powell B, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21.
Rhoades BL, Bumbarger BK, Moore JE. The role of a state-level prevention support system in promoting high-quality implementation and sustainability of evidence-based programs. Am J Community Psychol. 2012;50(3–4):386–401.
Rieckmann TR, Kovas AE, Cassidy EF, McCarty D. Employing policy and purchasing levers to increase the use of evidence-based practices in community-based substance abuse treatment settings: reports from single state authorities. Eval Program Plann. 2011;34(4):366–74.
Rieckmann T, Abraham A, Zwick J, Rasplica C, McCarty D. A longitudinal study of state strategies and policies to accelerate evidence-based practices in the context of systems transformation. Health Serv Res. 2015;50(4):1125–45.
Rubin RMM, Hurford MOO, Hadley T, Matlin S, Weaver S, Evans AC. Synchronizing watches: the challenge of aligning implementation science and public systems. Adm Policy Ment Health. 2016;43(6):1023–8.
Yamey G. What are the barriers to scaling up health interventions in low and middle income countries? A qualitative study of academic leaders in implementation science. Global Health. 2012;8:11.
Zhang Y, Marsh D. Learning by doing: the case of administrative policy transfer in China. Policy Stud. 2016;37(1):35–52.
Lavis JN, Rottingen JA, Bosch-Capblanch X, Atun R, El-Jardali F, Gilson L, et al. Guidance for evidence-informed policies about health systems: linking guidance development to policy development. PLoS Med. 2012;9(3):e1001186.
Shearer JC, Abelson J, Kouyate B, Lavis JN, Walt G. Why do policies change? Institutions, interests, ideas and networks in three cases of policy reform. Health Policy Plan. 2016;31(9):1200–11.
Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005.
Treasury Board of Canada Secretariat. Assessing, selecting, and implementing instruments for government action. Ottawa: Government of Canada; 2007.
VanDeusen LC, Engle R, Holmes S, Parker V, Petzel R, Nealon Seibert M, et al. Strengthening organizations to implement evidence-based clinical practices. Health Care Manage Rev. 2010;35(3):235–45.
Pal LA. Beyond policy analysis: public issue management in turbulent times. 5th ed: Nelson Education; 2014.
Lavis JN. Studying health-care reforms. In: Lazar H, Lavis J, Forest P-G, Church J, editors. Paradigm freeze: why it is so hard to reform health care in Canada. Kingston: McGill-Queen’s University Press; 2013.
Lavis JN, Wilson MG, Moat KA, Hammill AC, Boyko JA, Grimshaw JM, et al. Developing and refining the methods for a “one-stop shop” for research evidence about health systems. Heal Res Policy Syst. 2015;13(1):10.
Dente B. Who decides? Actors and their resources. In: Understanding policy decisions. SpringerBriefs in Applied Sciences and Technology. Cham: Springer; 2014. https://doi.org/10.1007/978-3-319-02520-9_2
This study was partially supported through a doctoral scholarship from the P.E. Trudeau Foundation.
The authors declare they have no competing interests.
Bullock, H.L., Lavis, J.N., Wilson, M.G. et al. Understanding the implementation of evidence-informed policies and practices from a policy perspective: a critical interpretive synthesis. Implementation Sci 16, 18 (2021). https://doi.org/10.1186/s13012-021-01082-7