- Short report
- Open access
Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study
Implementation Science volume 10, Article number: 109 (2015)
Abstract
Background
Poor terminological consistency for core concepts in implementation science has been widely noted as an obstacle to effective meta-analyses. This inconsistency is also a barrier for those seeking guidance from the research literature when developing and planning implementation initiatives. The Expert Recommendations for Implementing Change (ERIC) study aims to address one area of terminological inconsistency: discrete implementation strategies involving one process or action used to support a practice change. The present report describes the second stage of the ERIC project, which focuses on providing initial validation of the compilation of 73 implementation strategies identified in the first stage.
Findings
Purposive sampling was used to recruit a panel of experts in implementation science and clinical practice (N = 35). These key stakeholders used concept mapping sorting and rating activities to place the 73 implementation strategies into similar groups and to rate each strategy’s relative importance and feasibility. Multidimensional scaling analysis provided a quantitative representation of the relationships among the strategies, all but one of which were found to be conceptually distinct from the others. Hierarchical cluster analysis supported organizing the 73 strategies into 9 categories. The ratings data reflect those strategies identified as the most important and feasible.
Conclusions
This study provides initial validation of the implementation strategies within the ERIC compilation as being conceptually distinct. The categorization and strategy ratings of importance and feasibility may facilitate the search for, and selection of, strategies that are best suited for implementation efforts in a particular setting.
Background
Concerns about poor consistency in implementation science (IS) terminology have led researchers to characterize the field as a “Tower of Babel” [1]. Inconsistent terminology complicates literature searches, and researchers have found that search strategy yield and precision indices for implementation and quality improvement studies are moderate at best [2, 3]. This limits meta-analytic and replication efforts aimed at rigorously evaluating the effectiveness of implementation strategies and the value of existing literature for those enacting implementation initiatives. The science and practice of implementation would be greatly facilitated by a parsimonious nomenclature of conceptually distinct implementation strategies [2, 4–7].
Recently, Powell et al. [8] reviewed the health and mental health literature (including 41 compilations and reviews) and proposed a compilation of 68 discrete implementation strategies involving one action or process. This compilation served as the starting point for a subsequent multi-stage project called Expert Recommendations for Implementing Change (ERIC) [9]. The ERIC project’s first stage involved expert panelists (N = 71) using a modified Delphi process to revise the compilation, which resulted in an updated compilation of 73 discrete implementation strategies [10].
The aim of the ERIC project’s second stage, presented here, was to obtain preliminary validation of the compilation of 73 implementation strategies by studying the relationships between the strategies and obtaining relative importance and feasibility ratings for each strategy. Studying the relationships among the strategies supports evaluating whether the strategies are conceptually distinct from one another as well as how they can be organized into conceptually relevant groupings. The latter also serves the practical purpose of making it easier for stakeholders to consider the range of implementation strategies by thematic cluster. The importance and feasibility ratings provide insight into the perceived applicability of the strategies, and it is of general interest which strategies experts rate relatively high or low.
Method
A purposive sampling procedure was used to recruit a panel of implementation science and clinical experts (N = 35) to participate in concept mapping and rating tasks [9, 10]. A detailed description of the procedures has been published [9]; a summary is provided here. Concept mapping is a mixed-method procedure for engaging stakeholder groups in a structured conceptualization process [11]. This process supports visually representing the relationships among a set of related concepts, empirically clustering them into conceptually distinct categories, and rating them on multiple dimensions.
The Concept Systems Global MAX™ [12] web platform was used for the panel’s sorting and rating tasks and for data analysis. A more detailed introduction to concept mapping can be found in Trochim and Kane [13]. For the sorting task, participants were asked to sort virtual cards for each of the 73 strategies, accompanied by their definitions, into piles as they deemed appropriate. For the rating task, participants rated each strategy’s importance and feasibility on scales ranging from 1 (relatively unimportant/not at all feasible) to 5 (extremely important/extremely feasible). These global ratings were prefaced by the following instructions: “Please select a number from 1 to 5 for each discrete implementation strategy to provide a rating in terms of how important (feasible) you think it is. Keep in mind that we are looking for relative importance (feasibility), use all the values in the rating scale to make distinctions.” Participants were able to select which set of activities they wanted to do first and could work on the sorting and rating activities over multiple online sessions, at their convenience, before submitting their responses.
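Sorting data of this kind are conventionally aggregated into a strategy-by-strategy co-occurrence (similarity) matrix before any scaling is performed. The study’s analyses were run in the Concept Systems Global MAX platform; the Python sketch below is only an illustrative reconstruction of that aggregation step, using hypothetical pile data and item indices that are not the ERIC strategy numbers.

```python
import numpy as np

N_STRATEGIES = 73  # strategies indexed 0..72 for illustration

def cooccurrence_matrix(sorts, n_items=N_STRATEGIES):
    """Aggregate panelists' sorts into an item-by-item co-occurrence matrix.

    `sorts` holds one entry per panelist; each entry is a list of piles, and
    each pile is a list of strategy indices that the panelist grouped together.
    Cell (i, j) counts how many panelists placed strategies i and j in the
    same pile.
    """
    sim = np.zeros((n_items, n_items), dtype=int)
    for piles in sorts:
        for pile in piles:
            for i in pile:
                for j in pile:
                    if i != j:
                        sim[i, j] += 1
    return sim

# Hypothetical example with two panelists and tiny piles
# (real sorts cover all 73 strategies).
example_sorts = [
    [[0, 1, 2], [3, 4]],    # panelist 1
    [[0, 1], [2, 3], [4]],  # panelist 2
]
similarity = cooccurrence_matrix(example_sorts, n_items=5)
print(similarity[0, 1])  # 2 -> both panelists grouped strategies 0 and 1 together
```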
Multidimensional scaling and hierarchical cluster analyses were conducted to produce visual representations of the relationships among the strategies. Descriptive statistics for the importance and feasibility ratings were calculated, and each strategy’s mean importance and feasibility ratings were plotted against one another. The resulting scatterplot was divided into four quadrants, or “Go-zones” (I, II, III, IV), using the mean of each dimension; for example, quadrant I contains strategies with values above the means on both dimensions. The Go-zone quadrants column in Table 1 reflects the combined relative importance and feasibility of each strategy.
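As a rough illustration of the analyses just described, the sketch below converts a co-occurrence matrix into dissimilarities, computes a two-dimensional point map, partitions the map with hierarchical (Ward’s method) clustering, and assigns Go-zone quadrants by splitting the ratings at their means. It is a sketch under stated assumptions, not the study’s actual pipeline: classical (Torgerson) MDS stands in for the nonmetric MDS typically used in concept mapping, and the mapping of quadrants II and IV onto “important but less feasible” versus “feasible but less important” is assumed here.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def point_map(similarity, n_sorters, n_clusters=9):
    """Return 2-D coordinates and cluster labels for each strategy.

    Classical (Torgerson) MDS is used as a simple stand-in for nonmetric MDS;
    clusters are formed by Ward's method on the 2-D coordinates, mirroring how
    concept-mapping clusters partition the point map rather than the raw data.
    """
    dist = 1.0 - similarity / n_sorters              # dissimilarity in [0, 1]
    np.fill_diagonal(dist, 0.0)
    n = dist.shape[0]
    centering = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * centering @ (dist ** 2) @ centering   # double-centered squared distances
    eigvals, eigvecs = np.linalg.eigh(b)
    top = np.argsort(eigvals)[::-1][:2]              # two largest eigenvalues
    coords = eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0.0))
    labels = fcluster(linkage(coords, method="ward"), t=n_clusters, criterion="maxclust")
    return coords, labels

def go_zones(importance, feasibility):
    """Assign each strategy to a Go-zone quadrant by splitting at the means.

    Quadrant I = above both means and III = below both means, as in the text;
    labeling II as "important but less feasible" and IV as the reverse is an
    assumption made for illustration.
    """
    importance = np.asarray(importance, dtype=float)
    feasibility = np.asarray(feasibility, dtype=float)
    hi_imp = importance >= importance.mean()
    hi_feas = feasibility >= feasibility.mean()
    return np.where(hi_imp & hi_feas, "I",
           np.where(~hi_imp & ~hi_feas, "III",
           np.where(hi_imp, "II", "IV")))
```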
Results
Experts who participated in the concept mapping and rating tasks were affiliated with academic or healthcare institutions in the United States (n = 34) or Canada (n = 1). Thirty-two of the 35 experts provided valid sorts (>75 % of strategies sorted), and 30 provided importance and feasibility ratings for all strategies. Sixty-three percent of participants had expertise exclusively in IS, 29 % were experts in both IS and clinical practice, and 8 % indicated clinical practice expertise only. Sixty-nine percent of participants had some affiliation with the US Department of Veterans Affairs (VA), most of whom also held academic appointments in social science or health-related schools or departments.
Figure 1 presents a point map that visually represents the relationships among the 73 implementation strategies, with each point on the map representing a strategy. The strategies are numbered to aid in cross-referencing the spatial relationships of the points on the map with their labels enumerated in Table 1. All but two strategies were sorted as being conceptually distinct: strategies #66 (Use capitated payments) and #70 (Use other payment schemes) were always sorted together. Two other strategies (#35 Identify and prepare champions and #57 Recruit, designate, and train for leadership) were proximal to one another even though they were sorted together by only 4 of 32 panelists, indicating that they are more similar in how they relate to other strategies on the map than they are directly similar to one another.
The final clusters were developed over 3 weeks of deliberations by the ERIC investigative team. A 13-cluster starting point was selected because it is one standard deviation above the mean number of clusters typically obtained in concept mapping [14]; in this study, 69 % of respondents sorted strategies into 13 or fewer piles. We sequentially reviewed cluster merges and reached consensus to merge down to nine conceptually distinct clusters. For example, two clusters shown in pale green at the center bottom of Fig. 1 (separated by dashed lines) were merged to form a single cluster labeled Develop stakeholder interrelationships, as the original clusters were judged not sufficiently conceptually distinct.
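For readers who want to trace the sequential review of cluster merges computationally, a minimal sketch follows: it cuts the same hierarchical tree at 13, 12, …, 9 clusters and reports which groups combine at each step. The `linkage_matrix` argument is assumed to come from a Ward’s-method linkage such as the one sketched earlier; in the study itself, the decision to accept or reject each merge was made by the investigative team, not by an algorithm.

```python
from scipy.cluster.hierarchy import fcluster

def review_merges(linkage_matrix, start=13, stop=9):
    """Report which clusters combine as the tree is cut from `start` down to `stop`."""
    for k in range(start, stop, -1):
        before = fcluster(linkage_matrix, t=k, criterion="maxclust")
        after = fcluster(linkage_matrix, t=k - 1, criterion="maxclust")
        # A merge occurred wherever two distinct `before` labels share an `after` label.
        grouped = {}
        for b, a in zip(before, after):
            grouped.setdefault(a, set()).add(b)
        merges = [sorted(members) for members in grouped.values() if len(members) > 1]
        print(f"{k} -> {k - 1} clusters: merged cluster groups {merges}")
```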
When the team reviewed the clusters for conceptual clarity, three proposals came forward to move individual strategies to neighboring clusters. First, #62 (Start a dissemination organization) was moved from the Engage consumers cluster to the Change infrastructure cluster, as it was judged more similar to infrastructure support for a practice change than engaging consumers. Second, #48 (Organize clinician implementation team meetings) was moved to the Develop stakeholder interrelationships cluster from Adapt and tailor to the context, as the former has greater interpersonal focus than the latter. And finally, #58 (Remind clinicians) was moved to the Support clinicians cluster from Provide interactive assistance because it is more administrative than interactive in focus. Unanimous consensus was reached for the final cluster arrangements. Additional file 1 provides a cluster-by-cluster visual tour of the concept map.
A multi-step process was used to determine labels for the final clusters. The list began with the labels that expert panel members had given to the sorted groupings most similar to the final cluster solution. This list was supplemented with highly descriptive labels identified from the investigative team’s meeting minutes from the cluster solution deliberations. Proposed criteria for developing cluster labels (Table 2) were introduced for team comment by one of the authors (LJD), along with suggested label revisions. These criteria helped structure an iterative discussion among team members, the result of which was voted upon by the team and unanimously adopted.
Table 1 presents a summary of the 73 implementation strategies, organized by cluster with mean importance and feasibility ratings. There was a strong relationship (r = 0.7) between the feasibility and importance ratings, meaning that most strategies fell within either quadrant I (high importance and feasibility) or III (low importance/feasibility). However, there were still a number of strategies that were viewed as important but not as feasible (12 %, e.g., Access new funding), or feasible but less important (15 %, e.g., Remind clinicians). Clusters of strategies that are more immediate and concrete and are potentially more in the control of those tasked with supporting change (e.g., Use evaluative and iterative strategies, Train and educate stakeholders) tended to have higher importance and feasibility ratings. Clusters that are more strategic, but also potentially involve changing well-established systems (e.g., Change infrastructure, Utilize financial strategies), tended to have lower ratings. Figure 2 presents a graphic of the Go-zone data.
Discussion
Results from this study provide initial validation for viewing the 73 implementation strategies as conceptually distinct. Cluster analyses of the concept mapping data support grouping strategies into nine clusters which have practical heuristic value for those looking to the ERIC compilation of implementation strategies for guidance. The importance and feasibility ratings for the strategies supported the formation of Go-zone quadrants that can be used to help decision makers prioritize which strategies to use when planning an implementation initiative.
While the concept mapping approach used in this study represents a strong methodological approach to evaluating whether the 73 implementation strategies are conceptually distinct and to organizing them by theme and potential applicability (i.e., Go-zone analysis), there are notable limitations. Recruitment was restricted to time zones within the continental United States to minimize scheduling conflicts for elements of the ERIC project that required real-time interaction among participants. Thus, all but one of the 35 participants were from the United States, and 69 % had some affiliation with the VA. While concept maps with 30 or more participants are considered to be highly reliable [14], different results might be obtained if stakeholders from outside the United States have practice contexts that alter perceptions of these strategies’ interrelationships or ratings of their perceived importance and feasibility.
Abbreviations
- ERIC: Expert Recommendations for Implementing Change
- VA: US Department of Veterans Affairs
References
1. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5:1–11.
2. Hempel S, Rubenstein LV, Shanman RM, Foy R, Golder S, Danz M, et al. Identifying quality improvement intervention publications: a comparison of electronic search strategies. Implement Sci. 2011;6:1–10.
3. McKibbon KA, Lokker C, Wilczynski NL, Haynes RB, Ciliska D, Dobbins M, et al. Search filters can find some but not all knowledge translation articles in MEDLINE: an analytic survey. J Clin Epidemiol. 2012;65:651–9.
4. Michie S, Fixsen DL, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4:1–6.
5. Mazza D, Bairstow P, Buchan H, Chakraborty SP, Van Hecke O, Grech C, et al. Refining a taxonomy for guideline implementation: results of an exercise in abstract classification. Implement Sci. 2013;8:1–10.
6. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:1–11.
7. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, et al. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9:1–6.
8. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69:123–57.
9. Waltz TJ, Powell BJ, Chinman MJ, Smith JL, Matthieu MM, Proctor EK, et al. Expert recommendations for implementing change (ERIC): protocol for a mixed methods study. Implement Sci. 2014;9:1–12.
10. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:1–14.
11. Kane M, Trochim WMK. Concept mapping for planning and evaluation. Thousand Oaks, CA: Sage; 2007.
12. Concept Systems Global MAX. http://www.conceptsystems.com/content/view/the-concept-system.html
13. Trochim WMK, Kane M. Concept mapping: an introduction to structured conceptualization in health care. Int J Qual Health Care. 2005;17:187–91.
14. Rosas SR, Kane M. Quality and rigor of the concept mapping methodology: a pooled study analysis. Eval Program Plann. 2012;35:236–45.
Acknowledgements
This project is funded through the US Department of Veterans Affairs Veterans Health Administration Mental Health Quality Enhancement Research Initiative (QLP 55–025). The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the US government. Additionally, TJW received support from the VA Office of Academic Affiliations Advanced Fellowships Program in Health Services Research and Development at the Center for Mental Healthcare & Outcomes Research, and BJP received support from the National Institute of Mental Health (F31 MH098478), the Doris Duke Charitable Foundation (Fellowship for the Promotion of Child Well-Being), and the Fahs-Beck Fund for Research and Experimentation. MJC received support from the VISN 4 Mental Illness Research, Education, and Clinical Center at the VA Pittsburgh Healthcare System.
We would like to acknowledge the contributions of each member of the expert panel: Greg Aarons, University of California, San Diego; Mark Bauer, Harvard University and US Department of Veterans Affairs; Laurel Copeland, Scott and White Healthcare and US Department of Veterans Affairs; Torrey Creed, University of Pennsylvania; Laura Damschroder, University of Michigan and US Department of Veterans Affairs; Rani Elwy, Boston University and US Department of Veterans Affairs; Bradford Felker, University of Washington and US Department of Veterans Affairs; Hildi Hagedorn, University of Minnesota and US Department of Veterans Affairs; Alison Hamilton, University of California, Los Angeles and US Department of Veterans Affairs; Kathy Henderson, US Department of Veterans Affairs; Timothy Hogan, University of Massachusetts and US Department of Veterans Affairs; Kelly Koerner, Evidence-Based Practice Institute; Sarah Krein, University of Michigan and US Department of Veterans Affairs; Julie Kreyenbuhl, University of Maryland and US Department of Veterans Affairs; Sara Landes, University of Washington and US Department of Veterans Affairs; Cara Lewis, Indiana University-Bloomington; Julie Lowery, University of Michigan and US Department of Veterans Affairs; Aaron Lyon, University of Washington; Natalie Maples, University of Texas Health Science Center San Antonio; Alan McGuire, Indiana University-Purdue University Indianapolis and US Department of Veterans Affairs; Amanda Midboe, US Department of Veterans Affairs; Laura Murray, Johns Hopkins University; Princess Osei-Bonsu, US Department of Veterans Affairs; Richard Owen, University of Arkansas for Medical Sciences and US Department of Veterans Affairs; Louise Parker, University of Massachusetts Boston; Enola Proctor, Washington University in St. Louis; Mona Ritchie, US Department of Veterans Affairs; Angie Rollins, US Department of Veterans Affairs and Indiana University Purdue University Indianapolis; Anju Sahay, US Department of Veterans Affairs; Anne Sales, University of Michigan and US Department of Veterans Affairs; Sharon Straus, University of Toronto; Carol VanDeusen Lukas, Boston University and US Department of Veterans Affairs; Dawn Velligan, University of Texas Health Science Center San Antonio; Kara Zivin, University of Michigan and US Department of Veterans Affairs; and Cynthia Zubritsky, University of Pennsylvania.
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
All authors were involved in the conceptualization and design of this study. TJW and JEK are co-principal investigators of the ERIC project. JLS, MMM, MJC, and LJD are co-investigators. BJP and EKP are consultants. TJW led the data collection. TJW and BJP conducted the data analysis and drafted this manuscript. All authors read and approved the final manuscript.
Additional file
Additional file 1: A cluster-by-cluster visual tour of the concept map.
Rights and permissions
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
Cite this article
Waltz, T.J., Powell, B.J., Matthieu, M.M. et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Sci 10, 109 (2015). https://doi.org/10.1186/s13012-015-0295-0