- Systematic Review
- Open access
A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework
Implementation Science volume 5, Article number: 82 (2010)
Abstract
Background
The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices. It has been widely cited and used as the basis for empirical work; however, there has not yet been a literature review to examine how the framework has been used in implementation projects and research. The purpose of the present article was to critically review and synthesize the literature on PARIHS to understand how it has been used and operationalized, and to highlight its strengths and limitations.
Methods
We conducted a qualitative, critical synthesis of peer-reviewed PARIHS literature published through March 2009. We synthesized findings through a three-step process using semi-structured data abstraction tools and group consensus.
Results
Twenty-four articles met our inclusion criteria: six core concept articles from original PARIHS authors, and eighteen empirical articles ranging from case reports to quantitative studies. Empirical articles generally used PARIHS as an organizing framework for analyses. No studies used PARIHS prospectively to design implementation strategies, and there was generally a lack of detail about how variables were measured or mapped, or how conclusions were derived. Several studies used findings to comment on the framework in ways that could help refine or validate it. The primary issue identified with the framework was a need for greater conceptual clarity regarding the definition of sub-elements and the nature of dynamic relationships. Strengths identified included its flexibility, intuitive appeal, explicit acknowledgement of the outcome of 'successful implementation,' and a more expansive view of what can and should constitute 'evidence.'
Conclusions
While we found studies reporting empirical support for PARIHS, the single greatest need for this and other implementation models is rigorous, prospective use of the framework to guide implementation projects. There is also need to better explain derived findings and how interventions or measures are mapped to specific PARIHS elements; greater conceptual discrimination among sub-elements may be necessary first. In general, it may be time for the implementation science community to develop consensus guidelines for reporting the use and usefulness of theoretical frameworks within implementation studies.
Background
Only a small proportion of research findings are widely translated into clinical settings [1], often due to barriers in the local setting [2]. The Promoting Action on Research Implementation in Health Services framework, or PARIHS, is a conceptual framework that posits key, interacting elements that influence successful implementation of evidence-based practices (EBPs) [3–7]. Implementation researchers have widely cited PARIHS or used it as the basis for empirical work [8–11]. This body of research has occurred against the backdrop of broad calls to incorporate theoretical frameworks in quality improvement implementation activities and research [12–14].
It has been over a decade since Kitson and colleagues first described the PARIHS framework, and while several papers have been published that update and propose refinements [4–7, 14, 15], there has not yet been a literature review to examine how the framework has been used in implementation projects and research. Our interest in PARIHS grew out of its use by numerous researchers involved in the Veterans Health Administration (VA) Quality Enhancement Research Initiative and their expressed need for guidance in how to use it in implementation projects. The purpose of the present article is to critically review and synthesize the conceptual and empirical literatures on PARIHS to: understand how PARIHS has been used; understand how its elements and sub-elements have been operationalized; and highlight strengths and limitations of PARIHS relative to use of the framework to guide an implementation study. We close with a set of recommendations to increase the value of the PARIHS framework for guiding implementation activities and research.
PARIHS framework
PARIHS outlines the determinants of successful implementation of evidence into practice. It was initially published in 1998 as an unnamed framework, inductively developed from the authors' experience with practice improvement and guideline implementation efforts [3]. The authors presented three case examples, with accompanying descriptive analyses, to illustrate its usefulness. Subsequently, two concept analyses were published exploring the maturity, meaning, and characteristics of facilitation [4] and context [5] as they relate to implementation. These concept analyses were based on non-systematic reviews of the literature. The original authors published a refined version of the framework in 2002 based on theoretical insights from these concept analyses [15]. This article contained the first published use of the PARIHS label. A conceptual exploration of evidence was published in 2004, which rounded out the PARIHS team's review of their framework's three core elements [6]. Kitson and colleagues published a further clarification of PARIHS in 2008. This latest paper proposed that PARIHS is best used in a two-step process: first, as a framework to diagnose and guide preliminary assessment of evidence and context; and second, to guide development, selection, and assessment of facilitation strategies based on the existing evidence base and local context [7].
The framework comprises three interacting core elements: evidence (E) - 'codified and non-codified sources of knowledge' [7] as perceived by multiple stakeholders; context (C) - the quality of the environment or setting in which the research is implemented; and facilitation (F) - a 'technique by which one person makes things easier for others,' achieved through 'support to help people change their attitudes, habits, skills, ways of thinking, and working' [3]. The core assertion is that successful implementation is a function of E, C, and F and their interrelationships. The status of each of these elements can be assessed for whether it will have a weak ('low' rating) or strong ('high' rating) effect on implementation (Figure 1).
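The core concept papers express this assertion compactly as a functional shorthand rather than a quantitative model: with SI denoting successful implementation, SI = f(E, C, F), where implementation is hypothesized to be most likely when evidence and context are rated 'high' and facilitation is appropriate to the situation.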
In the PARIHS framework, evidence consists of four sub-elements, corresponding to four main sources of evidence: research evidence from studies and clinical practice guidelines including, but not limited to, formal experiments; clinical experience or related professional knowledge; patient preferences and experiences; and locally derived information or data, such as project evaluations or quality improvement initiatives [6, 7]. A fundamental premise of PARIHS is that while research evidence is often treated as the most heavily weighted source, all four sources have meaning and constitute evidence from the perspective of end users.
Context comprises four sub-elements: receptive context, organizational culture, leadership, and evaluation [5, 7]. All four sub-elements are defined in PARIHS core papers [5, 7], and, for culture, leadership, and evaluation, definitions from the broader literature are cited in a related concept analysis [5]. For example, culture is alternatively described as a 'paradigm,' as 'the way things are done around here,' and as a metaphor for the organization: something the organization is rather than something it possesses; leadership is described as an indicator or reflection of the 'nature of human relationships' in the organization, pertaining to the types of leadership roles enacted and who enacts them [3, 5]; and evaluation is described largely in terms of feedback [5] and how performance data are collected and reported [7]. Descriptions of each sub-element in earlier papers reflect 'high' and 'low' ratings, indicating a more or less favorable context for successful implementation, respectively. Indications of a high rating for context include, for example: clearly defined and acknowledged physical, social, cultural, structural, and/or system boundaries; valuing of individual staff and clients; promotion of organizational learning; transformational leadership and democratic or inclusive decision making; and feedback on individual, team, and/or system performance [15, 16].
Facilitation includes three sub-elements and an array of mechanisms to influence implementation of evidence into clinical practice. The first sub-element of facilitation concerns its purpose, e.g., whether facilitation is meant to support attainment of a specific goal (task-oriented) or to enable individuals or teams to reflect on and change their attitudes and ways of working (holistic-oriented) [15]. In the PARIHS framework, these two purposes are arrayed as endpoints on a continuum. The second and third sub-elements of facilitation are the role of the facilitator(s) and their associated skills and attributes, which are described for each of the two purposes. On the task-oriented end of the continuum, the facilitator might engage in episodic contacts and provide practical, focused help, which requires strong project management and technical skills but a relatively low level of intensity. On the holistic-oriented end, the facilitator might focus on building sustained partnerships with teams to assist them in developing their own practice-change skills, which requires a relatively high level of intensity.
Methods
We used qualitative, critical synthesis methods for this review because our objectives were descriptive (e.g., describing how PARIHS has been used) and critical (e.g., appraising relative strengths and weaknesses of the framework), rather than meta-analytic (e.g., calculating an average effect size) [17]. We describe our review process below.
Search strategy and selection of publications
Our literature search included three sources. First, we conducted key word searches of the PubMed and CINAHL databases using the terms 'PARIHS' and 'promoting action on research implementation in health services.' We selected PubMed because it represents the preeminent database of peer-reviewed literature in the health fields, and CINAHL because it focuses specifically on nursing literature, where some of the original PARIHS concept papers were published. We used limited key words because this review was focused on the PARIHS model, rather than implementation models generally. Second, we reviewed the reference lists of included articles. Third, we solicited citations from a PARIHS author and other colleagues familiar with this body of research.
We selected articles based on four a priori criteria: published peer-reviewed literature, English language, published prior to March 2009, and explicit reference to the PARIHS framework either by name or citation of core conceptual articles. We did not specify a priori exclusion criteria.
Appraisal and abstraction of articles
We appraised and abstracted included articles in a three-step process. First, each article was read by a primary reviewer who wrote a narrative synopsis using a template (see Additional File 1, Synopsis template). The purpose of the initial synopsis was to provide an overall summary and critique of the article. Second, the completed synopsis was distributed to and reviewed by all co-authors, and discussed and refined on a conference call. Third, one of the co-authors condensed each synopsis using a structured summary table, with a separate table for each article. The purpose of the summary tables was to create a concise, structured appraisal and critique for each article. Some papers were empirical and others were conceptual. Summary tables for empirical articles included the overall method/design, an appraisal of study quality, study outcomes, how PARIHS was proposed to be used and actually used, and an assessment of congruency between PARIHS and study methods (see Additional File 2, Empirical article summary table). These tables also listed how PARIHS elements and sub-elements were defined and measured or operationalized in the study, along with findings, barriers, and enablers to implementation. The summary tables for core concept articles focused on the framework's elements, sub-elements, limitations, recommendations, and other observations (Additional File 3, Core-concept article summary table). These summary tables were reviewed by the primary reviewer for that paper and again by all co-authors, discussed as a group, and affirmed or revised as needed. This collection of empirical and core summary tables constituted the analytic foundation for our meta-summary and synthesis.
Meta-summary and synthesis
Four co-authors reviewed the final set of summary tables and independently highlighted key points per article to create a meta-summary. Key points represented concepts, specific findings related to PARIHS generally and/or to specific elements or sub-elements, observations about the use of the framework, and conclusions. Information highlighted as a key point by at least three of the four co-authors was discussed further at a two-day, in-person working conference. The purpose of the discussion of key points was to explore and summarize similarities and differences across the papers (both empirical and core conceptual) and to develop qualitative themes. Some of the themes were descriptive, e.g., regarding the actual versus articulated use of PARIHS. Other themes were interpretive, e.g., our consensus judgments regarding overall limitations, related issues, and strengths of the framework relative to the ability of researchers to effectively use it to guide an implementation study. We developed implications for using the framework as well as related recommendations based on these synthesized findings. As with the article appraisal, the synthesis and recommendations were discussed with all co-authors and refined until consensus was reached.
Results
Search results
We initially identified 33 unique articles (Figure 2). We excluded an unpublished doctoral dissertation [18], and eight commentaries [19–26]. Commentaries did not reflect planned or actual application or refinement of PARIHS (See Additional File 4, Table of commentaries excluded from the synthesis). We included the remaining 24 articles in our review.
We characterized six articles as core concept articles (Table 1 Overview of core concept articles for the PARIHS framework). These were written by members of a PARIHS coordinating group (http://www.PARIHS.org/pages/contact_us.html) for the stated purpose of introducing [3] or elaborating on the framework, either as a whole [7, 15], or on one of its three core elements [4–6]. The remaining 18 articles (Table 2 Overview of empirical articles included in the synthesis) were a mix of case reports and qualitative or mixed-methods studies [27–33], quantitative studies [9–11, 34–36], literature reviews [37–39] and study protocols [40] or frameworks [41]. We refer to these collectively as empirical articles to distinguish them from the core concept articles.
Two of the empirical articles reported on the same study in which the Context Assessment Instrument (CAI) was developed based on PARIHS [35, 36]. We also obtained an unpublished final report for the project [42], which included all of the material in the two articles plus more methodological detail. We combined these sources into a single entry in Tables 3 and 4, yielding 17 study entries.
How and why PARIHS was used in studies
Empirical studies generally used PARIHS as an organizing framework for analyses, such as examining predictors of nurses' research utilization (RU) [9, 10, 34], or reporting findings, such as highlighting differences between a series of efficacy studies and a planned translational study [40] (Table 2 Overview of empirical articles included in the synthesis).
Stated reasons for using PARIHS included that it acknowledges the complexity of implementation (or knowledge translation) [39]; it includes contextual factors [38]; and that it explicitly includes and describes context and facilitation [30]. Generally, users referred to the intuitive appeal of the three main elements (evidence, context, and facilitation) and PARIHS's explicit acknowledgement of the complex interrelationships among elements and their effects on implementation. Five empirical articles provided no explicit rationale for selecting PARIHS.
How PARIHS elements were operationalized
Three empirical papers described development of survey instruments based on PARIHS: two reported on the same survey, which assessed only the element of context [35, 36], and the third reported on a survey assessing evidence and context [11]. A series of three studies mapped survey items from secondary datasets to PARIHS elements and tested their association with nurses' RU: one focused on context [34], and two on context and facilitation [9, 10]. Except for a study by the PARIHS team [8], the empirical articles were not designed to validate or refine PARIHS.
Among non-quantitative empirical articles, two provided details of how PARIHS was operationalized: one specified questions used in a program evaluation [33], and another proposed a PARIHS-based framework to enhance reflective professional practice [41]. The nine remaining empirical articles did not specify how elements and sub-elements were measured or assessed, such as coding definitions or logic models for drawing conclusions about observed relationships.
A critical appraisal of reviewed studies
A key strength of the existing PARIHS literature (Table 3 Core concept articles, and Table 4 Empirical articles) was that several studies used findings to comment on the framework in ways that could help refine or validate it. One example was a suggestion to address underlying motivation for change, such as relative advantage and tension for change [27, 28]. Another was a qualitative exploration by the PARIHS team of how the framework fit with empirical findings [29]. A series of three articles attempted to quantify measures of context and facilitation and test quantitative multi-level models using facilitation and context as predictors of RU by nurses [9, 10, 34].
We identified two major issues with the PARIHS literature through our review. First, none of the studies used PARIHS prospectively to design implementation strategies. With the exception of articles reporting on survey development [11, 35, 36], all of the empirical studies were retrospective or cross-sectional. The six core concept papers described analyses that were conducted at a high level, addressing broad concepts, and relied on non-systematic reviews of the literature.
Second, there was a significant lack of detail about how variables were measured [39], how they were mapped to PARIHS elements [38], or how results or conclusions were derived [33]. For example, Sharp and colleagues concluded that good implementation outcomes could be achieved in settings with poor context, but not in settings with both poor context and poor facilitation. However, the authors did not indicate which cases supported those conclusions or what characterized context and facilitation at those sites [30].
A critical appraisal of the PARIHS framework
Several overarching strengths of PARIHS emerged (Table 3 Core concept articles, and Table 4 Empirical articles). First, though studies have not done so to date, the developers describe an explicit method for using PARIHS to guide diagnostic analysis of evidence and context [7], findings from which should be used to plan facilitation strategies to accomplish implementation.
Second are its flexibility and applicability to a range of settings, as well as perceptions by users that it captures key elements of the implementation experience. This includes PARIHS' expansive acknowledgement of what can and should constitute 'evidence,' and its recognition that implementation is a complex, multi-faceted process that is dynamic and often unpredictable. In addition, several articles reported findings that support specific PARIHS elements or sub-elements, such as Estabrooks and colleagues' finding that measures of facilitation and context are significantly associated with nurses' RU [10].
The primary issue related to the framework was a need for greater conceptual clarity about the definitions of sub-elements and the nature of dynamic relationships among elements and sub-elements. In many cases, sub-elements appear to have significant conceptual overlap. For example, criteria for evaluating receptive context include 'power and authority processes' and whether or not cultural boundaries are clearly defined and acknowledged. These two criteria appear to overlap with the culture and leadership sub-elements, which include being 'able to define culture(s) in terms of prevailing values/beliefs' and 'democratic inclusive decision making processes.' It is not clear what distinguishes receptive context, as a construct, from culture and leadership. Another example is that facilitation is defined solely as a role, in terms of the individual who fills the role and the relationship they have with those implementing the change. As presently described, this element does not address implementation interventions such as reminders, web-based education, toolkits, social marketing, and audit and feedback that may be undertaken to facilitate implementation, and which could conceivably be carried out by a number of actors. Although PARIHS acknowledges the dynamic relationships among elements, the elements and sub-elements are described in linear terms, from 'low' to 'high,' with little explicit account of how or in what form dynamics among and across the sub-elements might emerge.
Both a strength and issue for PARIHS was the specification of the outcome 'successful implementation.' It was a strength in that the framework stipulates an outcome where many implementation models do not. However, there was little information in the six core articles about how to conceptualize or define successful implementation, and the empirical articles adopted a range of outcomes. Some articles used a broad outcome of RU [10, 39], i.e., the degree to which clinicians apply research knowledge in their practices generally. Others used the degree of implementation or uptake of specific practice changes [30, 31].
Discussion
Our objectives in the present synthesis were to understand how PARIHS has been used in implementation studies, how it has been operationalized, and the strengths and limitations of PARIHS and its supporting literature. We found a reasonably large published literature (33 unique articles identified, of which 24 were included and 18 were empirical), but this body of findings reflects many of the current limitations of the broader implementation science literature. These limitations present substantial opportunities for improvement; we highlight three below.
First, PARIHS was largely used and operationalized as an organizing device or heuristic, usually post hoc. However, the PARIHS developers intended the framework to be used to assess evidence and context prior to implementation, and then to use these findings to guide facilitation of implementation. To move the framework forward, we need empirical studies that use PARIHS to prospectively design or comprehensively evaluate implementation activities. Researchers should explain the degree to which intervention design decisions and change strategies are based on PARIHS. The lack of prospective implementation studies is not unique to PARIHS; all but a fraction of published implementation studies fail to explicitly use any theory at all [43, 44], so researchers do not appear to be conducting prospective implementation studies based on any conceptual framework; a similar lack of theoretical foundation is reported among studies of organizational factors linked to patient safety [45]. Our findings echo those of Kajermo and colleagues in a recent literature synthesis on use of the BARRIERS scale, which is intended to prospectively identify barriers to research use by nurses [46]. Based on the paucity of prospective studies, they concluded that no further descriptive studies should be done, and that only prospective studies would move the science forward. We extend the same call for studies using PARIHS.
Second, though a strength of the empirical literature was that some studies showed empirical support for PARIHS, this finding needs to be interpreted in light of the overall study designs, which were retrospective case reports or cross-sectional analyses and often lacked key methodological details. Furthermore, authors rarely contrasted their findings with previous studies; citation of prior work using PARIHS occurred almost exclusively in the introduction, to set the stage for the study or its conceptual rationale. This, too, may in part be a function of the current development of the implementation science literature, and the natural evolution of standards and expectations about what details researchers most need to report. It may be time for something akin to the CONSORT [47] or MOOSE [48] guidelines for reporting results of implementation intervention studies or implementation project evaluations. While implementation science may not be amenable to the same manner of checklists that have been applied to randomized trials and meta-analyses, there are key elements that could be described with sufficient specificity to provide guidance to both journal editors and researchers. These might include an explanation or rationale for mapping study findings to the constructs of the conceptual framework being used; a rationale for excluding certain elements; details about operationalization of constructs, including coding definitions for qualitative analyses; and discussion of the criteria authors use to draw conclusions about relationships between determinants and implementation outcomes. This might help address a key criticism of efforts to promote more theory-based implementation research, namely that translation of theory into intervention design is too subjective and opaque [49].
Finally, there are opportunities to improve the conceptual clarity of the framework itself, including refining conceptual definitions to more clearly draw distinctions among related sub-elements, such as receptive context, leadership, and culture. This will support more rigorous studies by making it easier for users to map measures back to PARIHS consistently, derive testable hypotheses using the framework, and design more effective implementation strategies. We have drafted an implementation guide, being published separately, which discusses in more detail recommendations for those using PARIHS in task-oriented implementation projects and research, or seeking to refine the framework. Below, we briefly discuss three specific opportunities to refine the PARIHS framework.
First, PARIHS acknowledges the dynamic relationships among elements and sub-elements in the framework and the often unpredictable nature of implementation. However, 'dynamic' implies that elements and sub-elements interact or act as modifiers or contingencies, such that the effect of one is dependent on others [50]. As a result, the same implementation intervention may have wildly different effects in different settings [51]. PARIHS would be strengthened further by beginning to describe how those dynamics might emerge and by providing examples that could eventually help identify more generalizable patterns. Identifying and describing all potential interactions is clearly impossible, but currently, PARIHS elements are described on a continuum, low to high, that strongly implies linear relationships, which are inconsistent both with the broader concept of PARIHS as a dynamic model and with available evidence. For example, prospective studies have found that senior leadership support changes dramatically over time, with senior leaders shifting among roles ranging from institutional mentors for the change to critics of it [52], and that senior leadership support is not always a strong driver and certainly not always a necessary condition for implementation [53, 54]. It may be possible to identify generalizable contextual interactions, such as senior leadership support being necessary for EBPs that involve coordination across departments or services, require large capital investments, or lack strong professional endorsement.
In part, the lack of specifics about interactions among elements may arise from PARIHS straddling the line between a higher order planned action (or prescriptive) theory (PAT) for use by change agents to guide their implementation strategy, and a classical (or descriptive/explanatory) model meant to describe or explain how change occurs. The core concept articles explicitly propose that PARIHS be used to guide implementation by assessing evidence and context in order to inform facilitation, strongly positioning PARIHS as a prescriptive model, albeit not with the detail of a PAT as described by Graham and Tetroe [53].
Second, we noted that a more explicit definition of 'successful implementation' is needed. This again is both a key strength of the framework and an opportunity to strengthen it. A clear definition of successful implementation is critical for moving the implementation science literature forward, and we may do well to draw on the literatures of other disciplines. For example, researchers in education [55] and health promotion [56] have written specifically about criteria for determining when new programs are fully implemented. Likewise, scholars in management have written about conceptual considerations for defining effective implementation of new practices such as IT systems [57] and banking practices [58], including distinguishing implementation from 'compliant' use that is either incomplete or likely to degrade.
Conceptually, successful implementation might comprise three distinct aspects, identified as part of our aforementioned implementation guide. All represent seemingly necessary conditions for concluding that a project has achieved successful implementation: realization of the implementation plan or strategy; achievement and maintenance of the targeted EBP; and achievement and maintenance of end-point patient or organizational outcomes. These three components reflect a logic model linking an implementation strategy to ultimate outcomes. This definition of successful implementation affords an understanding of when and how an implementation program has delivered the benefits as hypothesized. To accomplish that, we need to assess whether the implementation strategy occurred as planned, whether the EBP was established as needed, and whether desired outcomes followed.
Third, other conceptual models should be drawn on and compared to better elaborate the core PARIHS elements or to better position work using PARIHS in the broader literature. The PARIHS core concept papers make it clear that the developers envision PARIHS being used in combination with other conceptual frameworks. Findings in some of the studies suggest the value of making additional attributes of the evidence-based change more explicit, such as those identified in Rogers' Diffusion of Innovations framework [34]. For example, Rogers' innovation attribute of the observability of a new practice (i.e., the extent to which its use by an individual is readily perceived by others in their social network) [2, 59] does not appear to have an analogue in PARIHS. These types of comparisons and extensions would help build cumulative knowledge and inform refinements to the framework.
The PARIHS authors continue to revisit and refine the framework, recognize its limitations, and call for further research [7]. We consider this a critical strength of any framework. Researchers [60] and practitioners [61] continue to use PARIHS, and we expect more rigorous studies will be published. Already in the period since we completed our literature search, we are aware of at least five new publications citing PARIHS, including two articles presenting results of validations of survey instruments based on the framework [62, 63]. Also, several prospective research studies based on the framework are in progress by both the PARIHS team (http://www.parihs.org) and other research teams, including one conducting research in Vietnam and several conducting research in the Veterans Health Administration QUERI program within the US.
Limitations
Our review had two limitations. First, we did not assess the 'gray' or unpublished literature, or publications in languages other than English. As a result, we may have missed important work relating to PARIHS.
Second, we focused exclusively on the PARIHS framework, and not on literature regarding other frameworks that may include similar or related constructs. Doing so was beyond the scope of our synthesis, though we do comment on the need for greater comparison and linkages between PARIHS and other frameworks.
Some may also view our methods as limited because we did not conduct a quantitative meta-analysis. However, we used methods appropriate to our research questions and to the literature being reviewed, which included few quantitative studies. We also took several steps to increase the transparency and reliability of our results.
Summary
The single greatest need for researchers using PARIHS, and other implementation models, is to use the framework prospectively and comprehensively, and evaluate that use relative to its perceived strengths and issues for enhancing successful implementation. Ultimately, the proof of any implementation framework is its demonstrated usefulness in practical terms to design implementation interventions and make implementation more effective under various conditions. Studies using the framework in this way will move the whole field forward.
Researchers using PARIHS in studies or to guide action research should clearly explain how PARIHS is used and how interventions or measures map to specific PARIHS elements. For example, studies of facilitation activities should explain how facilitation purpose, role, and skills and attributes were defined or taken into account. Other reviews have similarly called for more explicit and detailed explanation of how theory is used in implementation studies [43, 44]. It may be time for the implementation science community to develop consensus guidelines for what should be reported.
References
McGlynn EA, Asch SM, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr EA: The quality of health care delivered to adults in the United States. N Engl J Med. 2003, 348 (26): 2635-2645. 10.1056/NEJMsa022615.
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009, 4: 50-10.1186/1748-5908-4-50.
Kitson A, Harvey G, McCormack B: Enabling the implementation of evidence based practice: a conceptual framework. Quality in Health Care. 1998, 7 (3): 149-158. 10.1136/qshc.7.3.149.
Harvey G, Loftus-Hills A, Rycroft-Malone J, Titchen A, Kitson A, McCormack B, Seers K: Getting evidence into practice: the role and function of facilitation. Journal of Advanced Nursing. 2002, 37 (6): 577-588. 10.1046/j.1365-2648.2002.02126.x.
McCormack B, Kitson A, Harvey G, Rycroft-Malone J, Titchen A, Seers K: Getting evidence into practice: the meaning of 'context'. J Adv Nurs. 2002, 38 (1): 94-104. 10.1046/j.1365-2648.2002.02150.x.
Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B: What counts as evidence in evidence-based practice?. J Adv Nurs. 2004, 47 (1): 81-90. 10.1111/j.1365-2648.2004.03068.x.
Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A: Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implementation Science. 2008, 3 (1): 1-10.1186/1748-5908-3-1.
Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace CM: Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006, 1: 23-10.1186/1748-5908-1-23.
Cummings GG, Estabrooks CA, Midodzi WK, Wallin L, Hayduk L: Influence of organizational characteristics and context on research utilization. Nursing research. 2007, 56 (4 Suppl): S24-39. 10.1097/01.NNR.0000280629.63654.95.
Estabrooks CA, Midodzi WK, Cummings GG, Wallin L: Predicting research use in nursing organizations: a multilevel analysis. Nursing research. 2007, 56 (4 Suppl): S7-23. 10.1097/01.NNR.0000280647.18806.98.
Bahtsevani C, Willman A, Khalaf A, Östman M: Developing an instrument for evaluating implementation of clinical practice guidelines: a test-retest study. Journal of Evaluation in Clinical Practice. 2008, 14 (5): 839-846.
Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N: Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. Journal of Clinical Epidemiology. 2005, 58: 107-112. 10.1016/j.jclinepi.2004.09.002.
Grol RPTM, Bosch MC, Hulscher MEJL, Eccles MP, Wensing M: Planning and Studying Improvement in Patient Care: The Use of Theoretical Perspectives. The Milbank Quarterly. 2007, 85 (1): 93-138. 10.1111/j.1468-0009.2007.00478.x.
The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implementation Science. 2006, 1 (1): 4-10.1186/1748-5908-1-4.
Rycroft-Malone J, Kitson A, Harvey G, McCormack B, Seers K, Titchen A, Estabrooks C: Ingredients for change: revisiting a conceptual framework. Qual Saf Health Care. 2002, 11 (2): 174-180. 10.1136/qhc.11.2.174.
Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A: Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implementation Science. 2008, 3: 1-1. 10.1186/1748-5908-3-1.
Sandelowski M, Barroso J: Handbook for synthesizing qualitative research. 2007, New York: Springer Pub. Co
Larkin RM: Challenges to prison-based mental health research: a case study. D.N.Sc. 2008, Columbia University
Donaldson NE, Rutledge DN, Ashley J: Outcomes of adoption: measuring evidence uptake by individuals and organizations. Worldviews on Evidence-Based Nursing. 2004, 1 (Suppl 1): S41-51. 10.1111/j.1524-475X.2004.04048.x.
Kavanagh T, Stevens B, Seers K, Sidani S, Watt-Watson J: Examining Appreciative Inquiry as a knowledge translation intervention in pain management. Canadian Journal of Nursing Research. 2008, 40 (2): 40-56.
Kavanagh T, Watt-Watson J, Stevens B: An examination of the factors enabling the successful implementation of evidence-based acute pain practices into pediatric nursing. Children's Health Care. 2007, 36 (3): 303-321.
Larkin ME, Griffith CA, Capasso VA, Cierpial C, Gettings E, Walsh K, O'Malley C: Promoting research utilization using a conceptual framework. J Nurs Adm. 2007, 37 (11): 510-516. 10.1097/01.NNA.0000295617.26980.d1.
O'Halloran P, Martin G, Connolly D: A model for developing, implementing, and evaluating a strategy to improve nursing and midwifery care. Practice Development in Health Care. 2005, 4 (4): 180-191. 10.1002/pdh.20.
Rycroft-Malone J: The PARIHS framework--a framework for guiding the implementation of evidence-based practice. Journal of nursing care quality. 2004, 19 (4): 297-304.
Wallin L, Profetto-McGrath J, Levers MJ: Implementing nursing practice guidelines: a complex undertaking. J Wound Ostomy Continence Nurs. 2005, 32 (5): 294-300. discussion 300-291
Walsh K, Lawless J, Moss C, Allbon C: The development of an engagement tool for practice development. Practice Development in Health Care. 2005, 4: 124-130. 10.1002/pdh.7.
Ellis I, Howard P, Larson A, Robertson J: From workshop to work practice: An exploration of context and facilitation in the development of evidence-based practice. Worldviews on Evidence-Based Nursing. 2005, 2 (2): 84-93. 10.1111/j.1741-6787.2005.04088.x.
Owen S, Milburn C: Implementing research findings into practice: improving and developing services for women with serious and enduring mental health problems. J Psychiatr Ment Health Nurs. 2001, 8 (3): 221-231. 10.1046/j.1365-2850.2001.00390.x.
Rycroft-Malone J, Harvey G, Seers K, Kitson A, McCormack B, Titchen A: An exploration of the factors that influence the implementation of evidence into practice. J Clin Nurs. 2004, 13 (8): 913-924. 10.1111/j.1365-2702.2004.01007.x.
Sharp ND, Pineros SL, Hsu C, Starks H, Sales AE: A Qualitative Study to Identify Barriers and Facilitators to Implementation of Pilot Interventions in the Veterans Health Administration (VHA) Northwest Network. Worldviews on Evidence-Based Nursing. 2004, 1 (2): 129-139. 10.1111/j.1741-6787.2004.04023.x.
Stetler C, Legro M, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace C: Role of "external facilitation" in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science. 2006, 1 (1): 23-10.1186/1748-5908-1-23.
Wallin L, Rudberg A, Gunningberg L: Staff experiences in implementing guidelines for Kangaroo Mother Care--a qualitative study. Int J Nurs Stud. 2005, 42 (1): 61-73. 10.1016/j.ijnurstu.2004.05.016.
Conklin J, Stolee P: A model for evaluating knowledge exchange in a network context. Canadian Journal of Nursing Research. 2008, 40 (2): 116-124.
Wallin L, Estabrooks CA, Midodzi WK, Cummings GG: Development and validation of a derived measure of research utilization by nurses. Nursing research. 2006, 55 (3): 149-160. 10.1097/00006199-200605000-00001.
Wright J: Developing a tool to assess person-centred continence care. Nurs Older People. 2006, 18 (6): 23-28.
Wright J, McCormack B, Coffey A, McCarthy G: Evaluating the context within which continence care is provided in rehabilitation units for older people. International Journal of Older People Nursing. 2007, 2 (1): 9-19. 10.1111/j.1748-3743.2007.00046.x.
Brown D, McCormack B: Developing Postoperative Pain Management: Utilising the Promoting Action on Research Implementation in Health Services (PARIHS) Framework. Worldviews on Evidence-Based Nursing. 2005, 2 (3): 131-141. 10.1111/j.1741-6787.2005.00024.x.
Meijers JM, Janssen MA, Cummings GG, Wallin L, Estabrooks CA, Halfens RYG: Assessing the relationships between contextual factors and research utilization in nursing: systematic literature review. Journal of Advanced Nursing. 2006, 55 (5): 622-635. 10.1111/j.1365-2648.2006.03954.x.
Milner M, Estabrooks CA, Myrick F: Research utilization and clinical nurse educators: A systematic review. Journal of evaluation in clinical practice. 2006, 12 (6): 639-655. 10.1111/j.1365-2753.2006.00632.x.
Alkema GE, Frey D: Implications of translating research into practice: a medication management intervention. Home health care services quarterly. 2006, 25 (1-2): 33-54. 10.1300/J027v25n01_03.
Doran DM, Sidani S: Outcomes-focused knowledge translation: a framework for knowledge translation and patient outcomes improvement. Worldviews on Evidence-Based Nursing. 2007, 4 (1): 3-13. 10.1111/j.1741-6787.2007.00073.x.
McCormack B, McCarthy G: Development of the Context Assessment Index (CAI). 2008, Republic of Ireland Health Research Board and the Northern Ireland Department of Health, Social Services and Public Safety
Davies P, Walker A, Grimshaw J: A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implementation Science. 2010, 5 (1): 14-10.1186/1748-5908-5-14.
Davies P, Walker A, Grimshaw J: Theories of behavior change in studies of guideline implementation. Proc Br Psychol Soc. 2003, 11:
Hoff T, Jameson L, Hannan E, Flink E: A review of the literature examining linkages between organizational factors, medical errors, and patient safety. Med Care Res Rev. 2004, 61 (1): 3-37. 10.1177/1077558703257171.
Nilsson Kajermo K, Bostrom AM, Thompson D, Hutchinson A, Estabrooks C, Wallin L: The BARRIERS scale -- the barriers to research utilization scale: A systematic review. Implementation Science. 2010, 5 (1): 32-10.1186/1748-5908-5-32.
Schulz KF, Altman DG, Moher D: CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials. BMC Med. 2010, 8: 18-10.1186/1741-7015-8-18.
Stroup DF, Berlin JA, Morton SC, Olkin I, Williamson GD, Rennie D, Moher D, Becker BJ, Sipe TA, Thacker SB: Meta-analysis of observational studies in epidemiology: a proposal for reporting. Meta-analysis Of Observational Studies in Epidemiology (MOOSE) group. Jama. 2000, 283 (15): 2008-2012. 10.1001/jama.283.15.2008.
Bhattacharyya O, Reeves S, Garfinkel S, Zwarenstein M: Designing theoretically-informed implementation interventions: Fine in theory, but evidence of effectiveness in practice is needed. Implementation Science. 2006, 1 (1): 5-10.1186/1748-5908-1-5.
Johns G: The essential impact of context on organizational behavior. Academy of Management Review. 2006, 31 (2): 386-408.
Helfrich CD, Weiner BJ, McKinney MM, Minasian L: Determinants of Implementation Effectiveness: Adapting a Framework for Complex Innovations. Med Care Res Rev. 2007, 64 (3): 279-303. 10.1177/1077558707299887.
Van De Ven AH, Polley DE: The Innovation Journey. 1999, New York: Oxford University Press
Edmondson A: Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly. 1999, 44 (2): 350-383. 10.2307/2666999.
Edmondson A: Speaking up in the operating room: How team leaders promote learning in interdisciplinary action teams. Journal of Management Studies. 2003, 40: 1419-1452. 10.1111/1467-6486.00386.
Yin RK: Changing urban bureaucracies: How new practices become routinized. 1979, Lexington, MA: Lexington Books
Goodman RM, Steckler A: A model for the institutionalization of health promotion programs. Family & Community Health. 1989, 11 (4): 63-78.
Klein KJ, Sorra JS: The challenge of innovation implementation. Academy of Management Review. 1996, 21 (4): 1055-1080. 10.2307/259164.
Nord WR, Tucker S: Implementing Routine and Radical Innovations. 1987, Lexington, Massachusetts: Lexington Books
Rogers EM: Diffusion of Innovations. 2003, New York, NY: The Free Press, Fifth edition
Rycroft-Malone J, Fontenla M, Seers K, Bick D: Protocol-based care: the standardisation of decision-making?. Journal of Clinical Nursing. 2009, 18 (10): 1490-1500. 10.1111/j.1365-2702.2008.02605.x.
Capasso V, Collins J, Griffith C, Lasala CA, Kilroy S, Martin AT, Pedro J, Wood SL: Outcomes of a clinical nurse specialist-initiated wound care education program: using the promoting action on research implementation in health services framework. Clinical Nurse Specialist: The Journal for Advanced Nursing Practice. 2009, 23 (5): 252-257. 10.1097/NUR.0b013e3181b207f5.
McCormack B, McCarthy G, Wright J, Coffey A: Development and Testing of the Context Assessment Index (CAI). Worldviews on Evidence-Based Nursing. 2009, 6 (1): 27-35. 10.1111/j.1741-6787.2008.00130.x.
Helfrich CD, Li YF, Sharp ND, Sales AE: Organizational readiness to change assessment (ORCA): Development of an instrument based on the Promoting Action on Research in Health Services (PARiHS) framework. Implementation Science. 2009, 4 (1): 38-10.1186/1748-5908-4-38.
Acknowledgements
This material is based upon work supported by the U.S. Department of Veterans Affairs, Office of Research and Development Health Services R&D Program. We wish to acknowledge the important contributions of Jeffrey Smith to the paper, and the important administrative assistance of Jared LeClerc and Rachel Smith. Also, our thanks to Corrine Voils for providing invaluable feedback on a draft of the paper, and to Lars Wallin and Jacqueline Tetroe for their excellent reviews and suggestions. The views expressed in this article are the authors' and do not necessarily reflect the position or policy of the Department of Veterans Affairs.
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
CBS conceived the study. All authors abstracted, reviewed data and provided critical input on findings. CDH wrote first draft of paper and CBS, LJD and HH provided major input and revisions. All authors read, critiqued and approved the final manuscript.
Electronic supplementary material
Additional file 1: Synopsis template. The synopsis template is a semi-structured form for initial narrative abstraction and critique of the included articles. It included the article abstract and six sections to be filled out by the reviewer, such as aspects of the PARIHS framework said to influence the study. (DOC 37 KB)
Additional file 2: Summary table template for empirical articles. The summary table template is a semi-structured tool for article abstraction and critique that was in tabular format and included more discrete data elements than the synopsis template, e.g., broken down by PARIHS element and sub-element. The summary table differed between the core-concept and empirical articles because of the types of publication (e.g., differences in the purposes and methods of the papers). This is the summary table for the empirical articles. (DOC 80 KB)
Additional file 3: Summary table template for core concept articles. The summary table template is a semi-structured tool for article abstraction and critique that was in tabular format and included more discrete data elements than the synopsis template, e.g., broken down by PARIHS element and sub-element. The summary table differed between the core-concept and empirical articles because of the types of publication and related content (e.g., differences in the purposes and methods of the papers). This is the summary table for the core concept articles. (DOC 79 KB)
Additional file 4: Commentaries excluded from the synthesis. This is a table of eight papers that were reviewed as part of our literature review and ultimately excluded because we defined them as commentaries that neither presented empirical research related to PARIHS nor conceptual critique or elaboration of the framework. The table includes abstracted data on the purpose of paper; the rationale for using PARIHS; and how PARIHS was to be used. (DOC 40 KB)
Rights and permissions
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Helfrich, C.D., Damschroder, L.J., Hagedorn, H.J. et al. A critical synthesis of literature on the promoting action on research implementation in health services (PARIHS) framework. Implementation Sci 5, 82 (2010). https://doi.org/10.1186/1748-5908-5-82