Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem

Abstract

Background

Implementation science is at a sufficiently advanced stage that it is appropriate for the field to reflect on progress thus far in achieving its vision, with a goal of charting a path forward. In this debate, we offer such reflections and report on potential threats that might stymie progress, as well as opportunities to enhance the success and impact of the field, from the perspective of a group of US-based researchers.

Main body

Ten mid-career, extramurally funded, US-based researchers completed a “pre-mortem,” a group brainstorming exercise that leverages prospective hindsight (imagining that an event has already occurred and generating explanations for it) to reduce the likelihood of a poor outcome. We came to consensus on six key themes related to threats and opportunities for the field: (1) insufficient impact, (2) too much emphasis on being a “legitimate science,” (3) re-creation of the evidence-to-practice gap, (4) difficulty balancing accessibility and field coherence, (5) inability to align timelines and priorities with partners, and (6) overly complex implementation strategies and approaches.

Conclusion

We submit this debate piece to generate further discussion with other implementation partners as our field continues to develop and evolve. We hope the key opportunities identified will enhance the future of implementation research in the USA and spark discussion across international groups. We will continue to learn with humility about how best to implement with the goal of achieving equitable population health impact at scale.

Background

Research on dissemination and implementation has a long, rich history [1]. We are grateful to be part of that history as some of the first US researchers to build implementation science careers while the field was formalizing [1]. Our backgrounds are in psychology, public health, social work, education, and medicine, with foundations in intervention science, clinical science, community psychology, and health services research. Most of us have clinical training and experience. We came to implementation science frustrated that patients and community members did not routinely receive evidence-based practices (EBPs) and that policies were not aligned with high-quality research evidence. We became aware that resources spent developing EBPs were not translating into their routine delivery outside of research contexts, and we recognized that the racially, ethnically, and socioeconomically diverse communities, populations, and settings that would benefit most from EBPs were not equitably reached. Implementation science attracted us as a path toward equitably changing healthcare quality, systems, and outcomes [2], that is, achieving population health impact and social justice at scale [3].

Implementation science has reached an appropriate time developmentally to reflect on its progress. In 2006, the flagship journal Implementation Science was launched. The first National Institutes of Health (NIH) implementation science conference was held in 2007. The past 15 years have seen thousands of articles; funding mechanisms, including from the NIH, the UK Medical Research Council, and the Canadian Institutes of Health Research; international meetings; additional journals; and a growing global cadre of implementation scientists.

In recent years, several self-critical assessments of the field have been offered by leading implementation scientists [4,5,6,7,8,9]. These assessments are welcome because they challenge commonly held assumptions and present opportunities to move the field forward, and given recent concerns that the field may be stagnating [4, 6], we would like to see even more such discussions. Our contribution differs from previous assessments in several ways. First, the majority of these assessments have been led by teams outside the US, including in Europe [4], Australia [5], and Canada [6]; in this commentary, we offer a US-based perspective on opportunities for the field to maintain forward momentum. Second, many of these assessments are based on retrospective reflections from leading implementation scientists. To come to consensus on the themes shared in this commentary, we instead used the innovative pre-mortem technique. A pre-mortem uses prospective hindsight: a group imagines that a failure has already occurred and generates explanations for it, thereby reducing the likelihood of that failure [10]. This allowed us to generate potential threats to the field prospectively, rather than retrospectively as in a post-mortem (see Additional file 1 for more details on the approach). Third, while two of the self-critical assessments, from Europe [4] and Australia [5], offer similar perspectives, particularly around partner engagement and the importance of embedded research infrastructures and capacity building, our commentary builds on these assessments and presents a set of threats and opportunities not fully articulated in previous pieces, particularly from the perspective of what the outcome might be if the field cannot address existing threats. Other self-critical assessments focus on specific issues within the field, such as new directions in audit and feedback [6], the relevance of implementation science to COVID-19 [7], the cultural politics of the field [9], and the personal reflections of one senior author [8]. Finally, our commentary offers the perspective, not explicitly stated in previous assessments, that implementation science as a discipline is not immune from the critiques leveled at other sciences. We hope this commentary will inspire dialogue, new solutions and methods, and innovative partnerships across international teams, and highlight the only partially tapped utility of implementation science for improving population health equitably.

Main body

The six themes we discuss below, identified through the pre-mortem (Table 1), are also challenges in the fields in which we initially trained. These themes threaten forward movement if we are not thoughtful about the field’s evolution and growth. Framing them using prospective hindsight highlights their complexities and points toward potential opportunities.

Table 1 Threats that might stymie progress in the field of implementation science: Themes, causes, and solutions

Theme 1: We did not impact population health or health equity

Threats

Impact is foundational to implementation science. We considered impact from an equity perspective: deploying discoveries that are relevant, appropriate, and feasible across diverse populations and settings for widespread health and societal benefit, while acknowledging the complexity of defining impact [12]. The literature contains only a few examples of the field having broad impact (e.g., implementation of patient safety checklists) [3]. This scarcity of successes may reflect the fact that many implementation studies have null results, that implementation efforts take many years to influence public health, or that broad impact is misaligned with the metrics, such as papers and grants, used to evaluate researchers and the quality of their research [13]. Regardless, as the field coalesces and grows, funding, uptake, and scaling of implementation approaches require that they demonstrate societal and population health impact and economic value. Below, we outline tensions we can address to demonstrate impact as the field continues to develop and to show its utility [14, 15].

Our mission to improve EBP implementation is more complex than instituting a discrete strategy [16]. The field’s relatively focused endeavor to improve the widespread, routine adoption, implementation, and sustainment of EBPs has therefore evolved to be more all-encompassing. This is partly attributed to findings that organizational factors such as culture predict much of the ability of health service organizations to provide high-quality care and implement EBPs [17, 18] and that policy significantly shapes health and inequities, partially through financing and incentives for changing the healthcare status quo [19]. Additionally, as part of context, upstream societal and structural factors such as structural racism and social determinants of health are recognized as critical for shaping health inequities and inequitable implementation [20]. Only recently, however, has the field more explicitly included and measured these determinants and their impact on implementation and health outcomes [20,21,22,23]. Given the important role of multilevel context in implementation, understanding the real-world complexity and interconnected nature of these determinants is critical. Yet inclusion of these complexities in our models and solutions takes more time and resources than was originally thought for a field whose mission is to hasten the deployment of science to practice.

Opportunities

As implementation researchers, our publications ought to detail impact, both through empirical evidence about health outcomes in our studies (including whether outcomes were equitably improved across all groups) and through evidence of impact at organizational or policy levels resulting from research and partnerships (e.g., results leading to state funding for EBP delivery, or partners reporting that implementation challenges were addressed). Measuring health outcomes is often challenging when study resources are allocated to rigorous evaluation of implementation strategies and outcomes, but it may offer the greatest opportunity to demonstrate impact. Increasingly, we need to leverage routinely collected health outcome or administrative data and other pragmatic measures [24].

Another potential solution to increase impact is better defining an implementation strategy’s scope. Some strategies focus on proximal implementation and clinical outcomes and should acknowledge their inability to meaningfully affect system-level outcomes; others are designed for system-level effects and should state their limitations for individual-level impact. This suggestion stems from our experience studying everything from individual clinician behavior to state and national policies, and our realization that balancing breadth and depth is important for the future of implementation. It also underscores the importance of being explicit about how and why an implementation strategy is intended to work (i.e., specifying hypothesized mechanisms) [16, 25, 26].

Because we must consider context, multilevel system variation, and other complexities while accelerating the implementation of EBPs in communities, team science is essential for equitable impact [27]. Examples include applying implementation science to examine and address social and structural determinants (e.g., structural racism) as part of contextual assessments, both to advance understanding of barriers to implementation and to inform the selection, refinement, or adaptation of EBPs and implementation strategies [20, 28]. This work, done in collaboration with community members and leaders, intervention developers, prevention scientists, policymakers, and other scientific and practitioner partners, can provide a foundation and strategies for shared responses to inequities and uneven EBP implementation, informed by implementation and policy development that prioritizes health equity [29, 30]. Implementation scientists can also prioritize EBPs and strategies with potential to promote health equity, highlighting the value and impact of the field while avoiding inadvertently reinforcing inequities. We can measure and track the equitable delivery of EBPs and implementation strategies across populations and settings and the extent to which these approaches alter health inequities [20, 21].

Areas in which we ought to generate more evidence to demonstrate impact include (a) investigating the relationship between implementation outcomes and health outcomes [31] and prioritizing both sets of variables, as suggested by hybrid designs [16]; (b) demonstrating improvement in population health, including promoting health equity and reducing health inequities [22]; and (c) demonstrating the economic impact of EBP implementation and of poor or ineffective implementation [14] (i.e., return on investment and value). Demonstrating the economic costs of effective strategies is critical [14, 16, 32, 33]. Without compelling evidence that implementation science-informed approaches yield a favorable return, policymakers and administrators may be reluctant to invest time and resources in complex approaches. Identifying the best approach to economic analysis, and ensuring these data are collected during implementation efforts, is essential to building a business case for funding implementation.

Finally, and perhaps most importantly, translating our scientific knowledge into usable knowledge for the public is a way forward for impact. This can be accomplished through multiple avenues. The recently published National Cancer Institute practitioner guide to implementation science [34] is one example of a product that can translate research to practice. We recommend that implementation scientists also clearly communicate the value of the field to the public and policymakers. The COVID-19 pandemic underscores the value of an implementation science-informed approach: an influential simulation paper published before emergency-use authorization of COVID-19 vaccines suggested that implementation, rather than vaccine effectiveness, would be the major challenge to global population vaccination [35], a prediction borne out by the subsequent rollout challenges. If more people in the public and in healthcare knew what implementation science offers, we could have more impact and add more value in meeting public needs. As implementation scientists, our responsibility is to shape the narrative about the value of our field [36]. This includes communicating our work in understandable ways and answering the key questions that policymakers, the public, and our broader communities have for us in lay venues, including op-eds [37, 38].

Theme 2: We over-anchored on becoming a “legitimate” science

Threats

The past 15 years have seen a flurry of activity around codifying and legitimizing the science of implementation. This pattern is consistent with the emergence of a new field that lacks a common body of facts, in which scientists converge on conceptual frameworks, terminology, methods, and designs to answer research questions [39]. A shared lexicon and tools are laudable goals and can legitimize implementation science, but they could undermine the future of the field if not approached thoughtfully.

First, we observe a tendency in the field to reify commonly used frameworks, approaches, and ways of thinking. Using similar terminology has clear communication advantages, but there is a disadvantage to every study applying the same conceptual frameworks, designs, and methods without critical thinking: doing so can contribute to stagnancy and limit innovation. For example, while Proctor and colleagues’ influential 2011 paper substantially advanced the field by defining implementation outcomes [40], scholars rarely posit outcomes beyond this initial set, and a few of those outcomes (e.g., fidelity) are over-represented relative to the others.

A second example is the idea that implementation science-related inquiries require an EBP rather than simply an existing innovation or program that meets a community’s need [41]. The COVID-19 pandemic demonstrated how quickly implementation science might become obsolete if we get involved only when an EBP exists [42, 43]. Furthermore, approaches that prioritize scientific evidence over community-defined evidence can disempower community partners [41, 44]. This might manifest as EBPs that do not reflect or involve populations that have experienced historical or ongoing mistreatment, discrimination, or injustice from public health and/or medical institutions, presenting foundational challenges to our ability to equitably reach, implement, and sustain EBPs [21].

A third challenge is related to our borrowing from disciplines such as organizational theory, behavioral science, and systems science. One danger, since funders and reviewers prioritize novelty [45], is borrowing from other fields to maximize innovation but doing so in a superficial manner that does not reap the benefits of deep interdisciplinary or transdisciplinary work.

Opportunities

Healthy critique, reflection, and dismantling of current thinking are needed for a scientific field to develop. We have opportunities to innovate in our methodologies and theories before settling on what is “widely accepted” [46, 47]. Although roughly 150 implementation frameworks have been published [48] and we must carefully consider the value of adding more, frameworks remain opportunities to shift paradigms and advance theory. Deeper application and evolution of methods from adjacent fields are opportunities to harness well-vetted theory, advance our science, and increase rigor and impact, particularly in promoting health equity. For example, we have seen recent innovations in adapting existing theories, models, and frameworks to focus more on equity (e.g., see [23, 49,50,51]). We also note opportunities to learn from and integrate theories and frameworks from fields with a long history of health equity scholarship, including anthropology, sociology, and public health [52]. Simultaneously, we must not overpromise the benefits of implementation science: we will quickly become disillusioned if we are not circumspect about the potential benefit, or lack thereof, of the products of our implementation work.

Theme 3: We recreated the research-to-practice gap

Threats

Although implementation science was created to reduce the research-to-practice gap, recent critiques suggest we may be recreating it [53]. This could undermine the forward movement of the field [5], including work to reach populations experiencing health inequities [22]. More bidirectional partnership between implementation research and practice is needed [54].

Because implementation science requires multilevel and partnered approaches (theme 1), it is complex by nature. Input from multiple sources, which often privileges researcher perspectives, may lead to implementation strategies that are not “designed for implementation.” In other words, many strategies are designed for maximal theoretical effect, with the unintended consequence of limited fit, feasibility, and/or affordability [55]. Additionally, because implementation science is relatively new, investigators may feel pressure to develop their own approach to push the field forward, especially given the premium that funders and reviewers place on innovation. The resulting innovations may be less responsive to partners and too complex for, or incompatible with, many practice settings. Access to an implementation strategy may also be limited when capacity to train others in complex, “proprietary” strategies is scarce. Having advocated for years that intervention developers design for implementation [56], we might consider heeding our own advice.

Second, the state of implementation frameworks is challenging because of both their sheer number and their limited utility for pragmatic application. The multitude of implementation frameworks [48] makes it considerably difficult for researchers and community partners to select a framework to guide their work and to apply the findings pragmatically.

Third, a key tension that we hear from partners is that implementation science should balance adaptation for context with generalizable knowledge. While context is key [57, 58], tailoring solutions for particular sites or efforts may not always be possible with limited resources. We ought to balance pragmatism, the creation of generalizable knowledge, and finite resources.

Opportunities

To avoid recreating the research-to-practice gap, we should balance advancing implementation science theory and general knowledge with serving research and community partners, all with finite resources. Solutions may include refining commonly used frameworks to enhance pragmatism and facilitate application. Examples include the sixth domain added to the Consolidated Framework for Implementation Research (CFIR) focused on patient needs [59] and the adaptation of CFIR for low- and middle-income countries [59].

Developing modular implementation approaches (i.e., menus of common implementation strategies that can be tailored to a setting) is an opportunity for innovation and for creating broadly useful strategies. These solutions are opportunities for both implementation researchers and practitioners, who apply research to transform practice. Another opportunity is a lesson from intervention development: avoid assuming “if you build it, they will come” [60].

To avoid a research-to-practice gap in implementation, we should assemble the voices of all key partners, including community members, implementation researchers, and practitioners. As humans, we are prone to dichotomous thinking, but implementation science will be stronger if we prevent the emergence of separate “research” and “practice” camps divided along ideological lines; true partnership is the most effective way to advance knowledge quickly while ensuring rigor, relevance, and translatability. One solution comes from the Society for Implementation Research Collaboration (SIRC), which proposed an integrated training experience for implementation researchers, practitioners/intermediaries, practice leaders, and policy leaders to reduce the implementation research-practice gap [60]. Building on principles of pragmatic research, team science (theme 1), and interprofessional education, this approach could be a model for integrated professional development.

Theme 4: We could not balance making implementation science available to everyone with retaining the coherence of the field

Threats

A major challenge for the field relates to capacity building [61], with the goal of making implementation science more broadly available to implementation research and practice. Pressures to create traditional niches of expertise have resulted in a sometimes insular field that often requires individuals to be perceived as “card-carrying” implementation scientists to obtain funds for large-scale implementation research. If we want to meet demand, have broader impact, and formalize as a scientific field, we need more implementation scientists [62]. However, having everyone “do implementation science” has the potential to dilute the field and lessen perceived innovation and scientific coherence. The epistemology of science offers many theories on this tension in how fields grow and thrive [63]. If we, as implementation scientists, act as gatekeepers to retain field coherence, we lose opportunities to partner with adjacent fields such as improvement [64] and intervention sciences and to grow synergistically rather than in parallel silos. We also give up the chance to embed in learning health systems [65] and other organizations internationally, including in the US [66], the UK [67], and Australia, and as repeatedly proposed in low- and middle-income countries [68, 69], where implementation science-informed approaches could work in synergy with quality improvement, clinical informatics, and innovation [70].

Opportunities

There is a growing understanding that more implementation science capacity is needed while retaining field coherence [71, 72]. One way we have come to think about this is to consider the needs of three groups. First, basic scientists and early-stage translational researchers should be aware of implementation science but will likely not incorporate its approaches into their work without partnering with an implementation scientist; this group benefits from awareness of implementation science methods, which can be built into graduate and/or postdoctoral training. The second group (e.g., health services researchers, intervention developers, clinical trialists) might include implementation science in their toolkit and use established methods (e.g., hybrid designs, co-design) in their projects; this group requires foundational training. The third group comprises dedicated implementation science methodologists who advance the field with their work and require advanced, specialized training. They may be most interested in a particular disease (e.g., cancer) and setting (e.g., acute care or schools), or they may be disease- and setting-agnostic, instead answering the most impactful implementation science questions. The intentional development of these different groups will promote the full range of implementation science, from basic science focused on theory and method development to applied, real-world approaches [73].

We envision a future in which all institutions, including large health systems, have a division or department of implementation scientists from the third group, and in which departments have individuals with implementation science expertise germane to the department’s area, akin to the biostatistician model [74, 75]. Exploration of additional models for supporting institutional implementation science capacity, including leveraging Clinical and Translational Science Award programs [73, 76, 77], is needed. This kind of growth will both democratize implementation science and promote the paradigm-shaping work needed to advance the field.

Theme 5: We could not align our timelines, incentives, or priorities with our partners

Threats

Challenges in aligning with partners have been described in related fields [73, 78]. Meaningful partnership with care delivery and community settings is the backbone of implementation science [79,80,81]. Thus, to do implementation research well, we should invest in and maintain relationships with community partners and the systems that employ or serve them. We define community broadly to include administrators, staff, and clinicians from local and state governments, payers, community-based organizations, and health systems, as well as community members, community leaders, and the patients and caregivers reached through them [82]. A major threat to the long-term viability of implementation science is misalignment of timelines, priorities, and incentives between our partners and the scientific enterprise of implementation research.

First, the priorities and timelines of implementation research are often misaligned with those of health systems and community settings. Science can be slow [83], and once health system or community leadership sets priorities around healthcare delivery, it expects change quickly. Similarly, partners’ needs and priorities might not align with those proposed in implementation research, particularly if partners are not meaningfully integrated into the research process from the outset [82] or if inequitable power and resource dynamics exist. In addition, by the time research is completed, contexts may have shifted and new interventions may have been developed, so the lag in delivery of the most advanced healthcare solutions persists, especially for under-resourced settings.

Second, there is a fundamental tension between academic incentives and the transformation of health and healthcare. As is typical in academia, implementation scientists are incentivized to publish and apply for grants rather than to transform practice, widening the research-to-practice gap (theme 3). This is a longstanding issue for healthcare researchers and other academics whose explicit goal is public or population health impact [12, 84]. These alignment challenges are compounded by current funding mechanisms, which make it difficult to launch projects responsive to emergent needs and can create disappointment or disillusionment among partners unfamiliar with grant timelines and processes.

Opportunities

We ought to move towards a pragmatic implementation science that prioritizes the needs of partners [3, 85]. One model that mitigates this issue is embedded research [3], in which researchers work within settings such as health systems and are funded by those settings to answer questions that match their priorities [86]. Instead of one-off studies, evaluation and research are built into implementation efforts for sequential, timely, rapid learning. This model allows both system change and the creation of generalizable knowledge that can transform other organizations and systems. An example is the creation of implementation laboratories such as the Audit and Feedback MetaLab [6, 11]. This model could become the norm rather than the exception. However, some care settings, including publicly funded mental health and smaller health service organizations, public health, and justice settings, are chronically underfunded and under-resourced, likely limiting embedded research with them.

Another opportunity is for funders and partners to codesign funding mechanisms that are responsive to community timelines and needs and sufficient for rigorous evaluation and tangible impact. Funders have recently deployed more flexible mechanisms with more resources and support for community partners (e.g., the National Cancer Institute Implementation Science Centers in Cancer Control [87] and the NIH COVID-19 RADx initiative), but most traditional mechanisms still do not align with real-world needs [88].

The power of alignment in implementation science is illustrated by the COVID-19 crisis, during which political, community, public health, and health system priorities coalesced around COVID-19 prevention and care. Some implementation scientists pivoted to rapid implementation science methods to meet their settings’ needs, exemplifying how to offer our skillset to partners in a time of need. For example, Penn Medicine used implementation mapping [89] to assemble key partners and develop five specific strategies for rapidly implementing prone positioning, which ameliorates COVID-19 symptoms, during the pandemic [90]. One way to harness alignment is to prioritize rapid implementation science [91,92,93], which balances speed, efficiency, and rigor by adapting both methods and trial design to meet objectives [94, 95]. Rapid implementation methods will continue to gain traction if researchers and partners continue to prioritize efficiency in methods and designs while maintaining rigor.

Theme 6: Our implementation strategies and processes were too complex and not well matched to partners’ needs

Threats

Research on implementation strategies has progressed tremendously, including conceptual classification of strategies [96,97,98], generation of an increasingly robust evidence base from rigorous trials, and establishment of reporting guidelines to improve rigor and reproducibility [99,100,101,102,103]. Despite these advances, complexity and misalignment with partners’ needs threaten the application of implementation strategies.

First, despite conceptual advances, our taxonomies of implementation strategies and behavior-change methods and techniques are by no means exhaustive. Fields such as behavioral economics and systems engineering offer insights on how to shape clinician decision-making under conditions of uncertainty and how to develop approaches matched to local needs, but these approaches are underemphasized in existing taxonomies such as the Expert Recommendations for Implementing Change (ERIC) compilation [17]. Moreover, many of our strategies arose out of predominantly clinical contexts, are not readily understandable by community partners, and require translation for applicability in community settings.

Second, the pressure to innovate (themes 2 and 3) can lead to rebranding, testing, and promoting implementation strategies that reinvent the wheel or represent only incremental tweaks of previous work. Rebranding and tweaking eliminate the advantages of a shared language [104] (theme 2) and stymie the conceptual and empirical development of the field.

Third, as the field focuses on implementation strategy mechanisms [105], which help us understand how strategies work and build causal theories, we risk becoming overly reductionist. The very language of “implementation mechanisms” may make our science feel less relevant to community-based collaborators. Our designs and methods may also stymie progress, for example, by emphasizing traditional designs such as randomized controlled trials over designs (e.g., adaptive, rapid, or systems science-based [106]) better suited to developing strategies, detecting signals of effectiveness [107], and capturing dynamic social processes within context.

Finally, our processes for designing and tailoring implementation strategies are imperfect and often mismatched to the challenges of the partners and setting [4, 108]. While the basic steps of systematically designing and tailoring implementation strategies are documented [109, 110], selecting strategies often requires intensive contextual inquiry, and which strategies effectively address the identified implementation determinants remains unclear. Not surprisingly, partners express frustration with the lengthy process, suggesting the need for methods that balance rigor and pragmatism.

Opportunities

There are numerous ways to develop implementation strategies that are better matched to determinants, more understandable to our partners, and more pragmatic, while building the science of implementation more efficiently. First, we can embrace systematic approaches that prompt implementers to consider what multilevel changes are required to implement, scale, and sustain interventions; what might help or hinder those changes; and how changes can be feasibly measured [103, 109]. Ideally, these approaches incorporate partner input, existing evidence on strategy effectiveness, and formal or informal theory that hypothesizes how strategies operate. Considering mechanisms can ensure that strategies are as efficient as possible and allow poorly performing strategies to be adjusted in subsequent efforts [25, 26, 105, 111]. Systematic methods [110, 112] include intervention (or implementation) mapping, which is increasingly applied to systematically design and/or tailor implementation strategies [89, 113]. Ample opportunities remain to make these and other methods more pragmatic and useful to implementers.

Implementation and sustainment determinants could be more feasibly identified through systematic reviews, rapid approaches to contextual inquiry [94, 114, 115], or existing data such as information included in the electronic health record. Determining how to rapidly prioritize determinants is also important. For example, one approach being tested involves evaluating the ubiquity, chronicity, and criticality of implementation determinants to prioritize which should be explicitly addressed [116].

Improving tools to help implementers identify appropriate implementation strategies is critical. This could involve refining taxonomies of implementation strategies such as the ERIC compilation to make the content and language more usable to partners in specific contexts (e.g., school mental health, community-based organizations) [100, 117, 118]; incorporating strategies from fields such as behavioral economics [17]; noting strategy relevance to specific phases (e.g., exploration, sustainment) and translational tasks (e.g., dissemination, scale-up) [119]; and articulating strategy mechanisms [105]. Tools that match strategies and behavior-change techniques to implementation determinants could help organizations and systems tailor strategies to their needs and be improved over time to incorporate conceptual and empirical advancements [117, 120].

Additional approaches to improve fit between strategies and partner needs and contexts include using and refining strategies that are inherently adaptive, such as facilitation [121, 122]. We could leverage user-centered design and approaches such as the Multiphase Optimization Strategy (MOST), Learn as You Go (LAGO), and Sequential Multiple Assignment Randomized Implementation Trial (SMART) designs that allow us to optimize implementation strategies and calibrate the level of implementation support provided based on demonstrated need [123,124,125,126,127].

Finally, we can avoid pseudoinnovation and efficiently develop the evidence base for strategies. One way is to improve the reporting of implementation strategies so that the effectiveness of strategies with similar components and hypothesized change mechanisms can be assessed more efficiently. In some intervention science domains, the identification of intervention commonalities facilitates integration of findings from different programs, teams, and studies [128]. A similar approach could synthesize the “common elements” of distinct implementation strategies so they can be flexibly applied to a range of implementation challenges [129]. Another promising approach is the “meta-laboratory” described by Grimshaw and colleagues [6], which compares different ways of providing audit and feedback: the more effective approach quickly becomes the standard of care within the health system and is compared in a subsequent trial to another audit-and-feedback strategy that may offer further efficiency or effectiveness gains. This may be an efficient way of developing a robust evidence base for implementation strategies.

Conclusion

We are privileged and humbled to be part of a developing field and optimistic that it has a long and successful future. Aligned with the themes from our pre-mortem exercise, we affirm the importance of examining our assumptions, reflecting with humility, and planning the way forward as the field approaches 20 years since the launch of Implementation Science; developmentally, the time is ripe for such reflection. A key insight from our group is that although implementation science drew us from other disciplines because of its promise for enhancing population health and promoting health equity, it is not immune from the threats that challenge other fields, including intervention science, such as academic incentive structures and misalignment with collaborator timelines. The themes we offer largely align with previous self-critical assessments from international teams outside the US while also adding new perspectives. Synthesizing threats and opportunities across international teams is an important future activity and could be the focus of an international convening of implementation scientists [4, 5].

We see several key opportunities to enhance the future of the field and leverage the power of prospective hindsight to ensure our success. We wrote this piece as a conversation starter. We hope it generates reflection from the vantage point of other implementation partners, particularly implementation practitioners and international colleagues as our field continues to develop.

Availability of data and materials

Not applicable.

Abbreviations

NIH: National Institutes of Health

EBP: Evidence-based practice

CFIR: Consolidated Framework for Implementation Research

SIRC: Society for Implementation Research Collaboration

ERIC: Expert Recommendations for Implementing Change

MOST: Multiphase Optimization Strategy

LAGO: Learn as You Go

SMART: Sequential Multiple Assignment Randomized Implementation Trial

References

  1. Estabrooks PA, Brownson RC, Pronk NP. Dissemination and implementation science for public health professionals: an overview and call to action. Prev Chronic Dis. 2018;15:E162 PMCID: PMC6307829.

  2. Braveman P. What are health disparities and health equity? We need to be clear. Public Health Rep. 2014;129 Suppl 2(Suppl 2):5–8 PMCID: PMC3863701.

  3. Damschroder LJ, Knighton AJ, Griese E, Greene SM, Lozano P, Kilbourne AM, et al. Recommendations for strengthening the role of embedded researchers to accelerate implementation in health systems: findings from a state-of-the-art (SOTA) conference workgroup. Healthc (Amst). 2021;8 Suppl 1(Suppl 1):100455 PMCID: PMC8243415.

  4. Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med. 2019;17(1):88 PMCID: PMC6505277.

  5. Rapport F, Smith J, Hutchinson K, Clay-Williams R, Churruca K, Bierbaum M, et al. Too much theory and not enough practice? The challenge of implementation science application in healthcare practice. J Eval Clin Pract. 2021. https://doi.org/10.1111/jep.13600.

  6. Grimshaw JM, Ivers N, Linklater S, Foy R, Francis JJ, Gude WT, et al. Reinvigorating stagnant science: implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. BMJ Qual Saf. 2019;28(5):416–23 PMCID: PMC6559780.

  7. Wensing M, Sales A, Armstrong R, Wilson P. Implementation science in times of Covid-19. Implement Sci. 2020;15(1):42 PMCID: PMC7276954.

  8. Stange KC. Commentary: RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2020;8:245 PMCID: PMC7347750.

  9. Boulton R, Sandall J, Sevdalis N. The cultural politics of ‘Implementation Science’. J Med Humanit. 2020;41(3):379–94 PMCID: PMC7343725.

  10. Mitchell DJ, Edward Russo J, Pennington N. Back to the future: temporal perspective in the explanation of events. J Behav Decis Mak. 1989;2(1):25–38. https://doi.org/10.1002/bdm.3960020103.

  11. Ottawa Hospital Research Institute. The Audit & Feedback MetaLab. 2021. http://www.ohri.ca/auditfeedback/. Accessed 5 Aug 2021.

  12. Luke DA, Sarli CC, Suiter AM, Carothers BJ, Combs TB, Allen JL, et al. The translational science benefits model: a new framework for assessing the health and societal benefits of clinical and translational sciences. Clin Transl Sci. 2018;11(1):77–84. https://doi.org/10.1111/cts.12495.

  13. Brownson RC, Chriqui JF, Stamatakis KA. Understanding evidence-based public health policy. Am J Public Health. 2009;99(9):1576–83 PMCID: PMC2724448.

  14. Eisman AB, Kilbourne AM, Dopp AR, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433. https://doi.org/10.1016/j.psychres.2019.06.008.

  15. Ellinger AD, Ellinger AE, Yang B, Howton SW. The relationship between the learning organization concept and firms’ financial performance: an empirical assessment. Hum Resour Dev Q. 2002;13(1):5–22. https://doi.org/10.1002/hrdq.1010.

  16. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, et al. Enhancing the impact of implementation strategies in healthcare: a research agenda. Front Public Health. 2019;7:3 PMCID: PMC6350272.

  17. Beidas RS, Buttenheim AM, Mandell DS. Transforming mental health care delivery through implementation science and behavioral economics. JAMA Psychiatry. 2021;78(9):941–2. https://doi.org/10.1001/jamapsychiatry.2021.1120.

  18. Glisson C, Williams NJ. Assessing and changing organizational social contexts for effective mental health services. Annu Rev Public Health. 2015;36:507–23. https://doi.org/10.1146/annurev-publhealth-031914-122435.

  19. Hoagwood KE, Purtle J, Spandorfer J, Peth-Pierce R, Horwitz SM. Aligning dissemination and implementation science with health policies to improve children’s mental health. Am Psychol. 2020;75(8):1130 PMCID: PMC8034490.

  20. Shelton RC, Adsul P, Oh A. Recommendations for addressing structural racism in implementation science: a call to the field. Ethn Dis. 2021;31(Suppl 1):357–64 PMCID: PMC8143847.

  21. Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190 PMCID: PMC7069050.

  22. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):28 PMCID: PMC7977499.

  23. Woodward EN, Matthieu MM, Uchendu US, Rogal S, Kirchner JE. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci. 2019;14(1):1–18 PMCID: PMC6417278.

  24. Glasgow RE, Riley WT. Pragmatic measures: what they are and why we need them. Am J Prev Med. 2013;45(2):237–43. https://doi.org/10.1016/j.amepre.2013.03.010.

  25. Lewis CC, Powell BJ, Brewer SK, Nguyen AM, Schriger SH, Vejnoska SF, et al. Advancing mechanisms of implementation to accelerate sustainable evidence-based practice integration: protocol for generating a research agenda. BMJ Open. 2021;11(10):e053474 PMCID: PMC8524292.

  26. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136. https://doi.org/10.3389/fpubh.2018.00136.

  27. Committee on the Science of Team Science; Board on Behavioral, Cognitive, and Sensory Sciences; Division of Behavioral and Social Sciences and Education; National Research Council. Enhancing the effectiveness of team science. Washington, DC: National Academies Press (US); 2015. https://doi.org/10.17226/19007.

  28. Shelton RC, Adsul P, Oh A, Moise N, Griffith D. Application of an anti-racism lens in the field of implementation science: recommendations for reframing implementation research with a focus on justice and racial equity. Implement Res Pract. 2021. https://doi.org/10.1177/26334895211049482.

  29. Emmons KM, Chambers DA. Policy implementation science - an unexplored strategy to address social determinants of health. Ethn Dis. 2021;31(1):133–8. https://doi.org/10.18865/ed.31.1.133.

  30. Bailey ZD, Krieger N, Agénor M, Graves J, Linos N, Bassett MT. Structural racism and health inequities in the USA: evidence and interventions. Lancet. 2017;389(10077):1453–63. https://doi.org/10.1016/s0140-6736(17)30569-x.

  31. Garner BR, Patel SV, Kirk MA. Priority domains, aims, and testable hypotheses for implementation research: protocol for a scoping review and evidence map. Syst Rev. 2020;9(1):277 PMCID: PMC7716483.

  32. Hoomans T, Severens JL. Economic evaluation of implementation strategies in health care. Implement Sci. 2014;9:168. https://doi.org/10.1186/s13012-014-0168-y.

  33. Roberts SLE, Healey A, Sevdalis N. Use of health economic evaluation in the implementation and improvement science fields - a systematic literature review. Implement Sci. 2019;14(1):72 PMCID: PMC6631608.

  34. Noe MH, Shin DB, Wan MT, Gelfand JM. Objective measures of psoriasis severity predict mortality: a prospective population-based cohort study. J Invest Dermatol. 2018;138(1):228–30 PMCID: PMC6748628.

  35. Paltiel AD, Schwartz JL, Zheng A, Walensky RP. Clinical outcomes of a COVID-19 vaccine: implementation over efficacy. Health Aff (Millwood). 2021;40(1):42–52 PMCID: PMC7931245.

  36. Neta G, Glasgow RE, Carpenter CR, Grimshaw JM, Rabin BA, Fernandez ME, et al. A framework for enhancing the value of research for dissemination and implementation. Am J Public Health. 2015;105(1):49–57 PMCID: PMC4265905.

  37. Bennett GG, Shelton RC. Extending our reach for greater impact. Health Educ Behav. 2017;44(6):835–8. https://doi.org/10.1177/1090198117736354.

  38. Mired PD. Covid-19 unmasks what cancer patients have long known: the world needs a better and more equitable health system: Cable News Network; 2020. https://www.cnn.com/2020/06/20/opinions/covid-19-unmasks-what-cancer-patients-have-long-known/index.html. Accessed 5 Aug 2021

  39. Hacking I. Introduction. In: Kuhn TS, editor. The structure of scientific revolutions: 50th anniversary. 4th ed: University of Chicago Press; 2012. https://doi.org/10.1177/0048393112473424.

  40. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76 PMCID: PMC3068522.

  41. Martinez K, Callejas L, Hernandez M. Community-defined evidence: a bottom-up behavioral health approach to measure what works in communities of color; 2010.

  42. Chambers DA. Considering the intersection between implementation science and COVID-19. Implement Res Pract. 2020;1. https://doi.org/10.1177/0020764020925994.

  43. Taylor SP, Kowalkowski MA, Beidas RS. Where is the implementation science? An opportunity to apply principles during the COVID-19 pandemic. Clin Infect Dis. 2020;71(11):2993–5 PMCID: PMC7314225.

  44. Shelton RC, Brotzman LE, Johnson D, Erwin D. Trust and mistrust in shaping adaptation and de-implementation in the context of changing screening guidelines. Ethn Dis. 2021;31(1):119–32 PMCID: PMC7843040.

  45. Zhuang H, Acuna DE. The effect of novelty on the future impact of scientific grants. arXiv. 2019. https://doi.org/10.48550/arXiv.1911.02712.

  46. Kislov R, Pope C, Martin GP, Wilson PM. Harnessing the power of theorising in implementation science. Implement Sci. 2019;14(1):1–8 PMCID: PMC6905028.

  47. Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, et al. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front Public Health. 2019;7:64 PMCID: PMC6450067.

  48. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol. 2018;100:92–102. https://doi.org/10.1016/j.jclinepi.2018.04.008.

  49. Allen M, Wilhelm A, Ortega LE, Pergament S, Bates N, Cunningham B. Applying a race (ism)-conscious adaptation of the CFIR Framework to understand implementation of a school-based equity-oriented intervention. Ethn Dis. 2021;31(Suppl 1):375–88 PMCID: PMC8143857.

  50. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front Public Health. 2020;8:134 PMCID: PMC7235159.

  51. Etherington N, Rodrigues IB, Giangregorio L, Graham ID, Hoens AM, Kasperavicius D, et al. Applying an intersectionality lens to the theoretical domains framework: a tool for thinking about how intersecting social identities and structures of power influence behaviour. BMC Med Res Methodol. 2020;20(1):169 PMCID: PMC7318508.

  52. Snell-Rood C, Jaramillo ET, Hamilton AB, Raskin SE, Nicosia FM, Willging C. Advancing health equity through a theoretically critical implementation science. Transl Behav Med. 2021;11(8):1617–25 PMCID: PMC8367016.

  53. Westerlund A, Nilsen P, Sundberg L. Implementation of implementation science knowledge: the research-practice gap paradox. Worldviews Evid Based Nurs. 2019;16(5):332 PMCID: PMC6899530.

  54. Green LW. Making research relevant: if it is an evidence-based practice, where’s the practice-based evidence? Fam Pract. 2008;25(Suppl 1):i20–4. https://doi.org/10.1093/fampra/cmn055.

  55. Lyon AR, Coifman J, Cook H, McRee E, Liu FF, Ludwig K, et al. The Cognitive Walkthrough for Implementation Strategies (CWIS): a pragmatic method for assessing implementation strategy usability. Implement Sci Commun. 2021;2(1):1–16 PMCID: PMC8285864.

  56. Dopp AR, Parisi KE, Munson SA, Lyon AR. Aligning implementation and user-centered design strategies to enhance the impact of health services: results from a concept mapping study. Implement Sci Commun. 2020;1(1):17 PMCID: PMC7427975.

  57. Williams NJ, Beidas RS. Annual research review: the state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry. 2019;60(4):430–50 PMCID: PMC6389440.

  58. Kemp CG, Wagenaar BH, Haroz EE. Expanding hybrid studies for implementation research: intervention, implementation strategy, and context. Front Public Health. 2019;7:325 PMCID: PMC6857476.

  59. Safaeinili N, Brown-Johnson C, Shaw JG, Mahoney M, Winget M. CFIR simplified: pragmatic application of and adaptations to the Consolidated Framework for Implementation Research (CFIR) for evaluation of a patient-centered care transformation within a learning health system. Learn Health Syst. 2020;4(1):e10201 PMCID: PMC6971122.

  60. Lyon AR, Comtois KA, Kerns SEU, Landes SJ, Lewis CC. Closing the science–practice gap in implementation before it widens. In: Implementation Science 3.0. Cham: Springer; 2020. p. 295–313. https://doi.org/10.1007/978-3-030-03874-8_12.

  61. Davis R, D’Lima D. Building capacity in dissemination and implementation science: a systematic review of the academic literature on teaching and training initiatives. Implement Sci. 2020;15(1):1–26 PMCID: PMC7597006.

  62. Herrera M, Roberts DC, Gulbahce N. Mapping the evolution of scientific fields. PLoS One. 2010;5(5):e10355 PMCID: PMC2864309.

  63. Coccia M. The evolution of scientific disciplines in applied sciences: dynamics and empirical properties of experimental physics. Scientometrics. 2020;124(1):451–87. https://doi.org/10.1007/s11192-020-03464-y.

  64. Koczwara B, Stover AM, Davies L, Davis MM, Fleisher L, Ramanadhan S, et al. Harnessing the synergy between improvement science and implementation science in cancer: a call to action. J Oncol. 2018;14(6):335 PMCID: PMC6075851.

  65. Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: a new model for biomedical research. JAMA. 2016;315(18):1941–2 PMCID: PMC5624312.

  66. Gould MK, Sharp AL, Nguyen HQ, Hahn EE, Mittman BS, Shen E, et al. Embedded research in the learning healthcare system: ongoing challenges and recommendations for researchers, clinicians, and health system leaders. J Gen Intern Med. 2020;35(12):3675–80 PMCID: PMC7728937.

  67. Cheetham M, Wiseman A, Khazaeli B, Gibson E, Gray P, Van der Graaf P, et al. Embedded research: a promising way to create evidence-informed impact in public health? J Public Health (Oxf). 2018;40(Suppl 1):i64–70. https://doi.org/10.1093/pubmed/fdx125.

    Article  CAS  Google Scholar 

  68. English M, Irimu G, Agweyu A, Gathara D, Oliwa J, Ayieko P, et al. Building learning health systems to accelerate research and improve outcomes of clinical care in low- and middle-income countries. PLoS Med. 2016;13(4):e1001991 PMCID: PMC4829240.

    Article  PubMed  PubMed Central  Google Scholar 

  69. Sheikh K, Agyepong I, Jhalani M, Ammar W, Hafeez A, Pyakuryal S, et al. Learning health systems: an empowering agenda for low-income and middle-income countries. Lancet. 2020;395(10223):476–7. https://doi.org/10.1016/s0140-6736(19)33134-4.

    Article  PubMed  Google Scholar 

  70. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective. Transl Behav Med. 2017;7(3):624–35 PMCID: PMC5645271.

    Article  PubMed  Google Scholar 

  71. Leppin AL, Baumann AA, Fernandez ME, Rudd BN, Stevens KR, Warner DO, et al. Teaching for implementation: a framework for building implementation research and practice capacity within the translational science workforce. J Clin Transl Sci. 2021;5(1):e147 PMCID: PMC8411269.

    Article  PubMed  PubMed Central  Google Scholar 

  72. Leppin AL, Mahoney JE, Stevens KR, Bartels SJ, Baldwin LM, Dolor RJ, et al. Situating dissemination and implementation sciences within and across the translational research spectrum. J Clin Transl Sci. 2019;4(3):152–8 PMCID: PMC7348034.

    Article  PubMed  PubMed Central  Google Scholar 

  73. Tabak RG, Bauman AA, Holtrop JS. Roles dissemination and implementation scientists can play in supporting research teams. Implement Sci Commun. 2021;2(1):9 PMCID: PMC7811259.

    Article  PubMed  PubMed Central  Google Scholar 

  74. Lee KJ, Moreno-Betancur M, Kasza J, Marschner IC, Barnett AG, Carlin JB. Biostatistics: a fundamental discipline at the core of modern health data science. Med J Aust. 2019;211(10):444 PMCID: PMC6899583.

    Article  PubMed  PubMed Central  Google Scholar 

  75. Brimacombe MB. Biostatistical and medical statistics graduate education. BMC Med Educ. 2014;14(1):1–5 PMCID: PMC3907662.

    Article  Google Scholar 

  76. Mehta TG, Mahoney J, Leppin AL, Stevens KR, Yousefi-Nooraie R, Pollock BH, et al. Integrating dissemination and implementation sciences within Clinical and Translational Science Award programs to advance translational research: recommendations to national and local leaders. J Clin Transl Sci. 2021;5(1):e151 PMCID: PMC8411263.

    Article  PubMed  PubMed Central  Google Scholar 

  77. Tabak RG, Padek MM, Kerner JF, Stange KC, Proctor EK, Dobbins MJ, et al. Dissemination and implementation science training needs: insights from practitioners and researchers. Am J Prev Med. 2017;52(Suppl 3):S322–S29 PMCID: PMC5321656.

    Article  PubMed  PubMed Central  Google Scholar 

  78. Jones L, Wells K, Norris K, Meade B, Koegel P. The vision, valley, and victory of community engagement. Ethn Dis. 2009;19(4 Suppl 6):S6-3-7 PMCID: PMC4841676.

    PubMed  Google Scholar 

  79. Ramanadhan S, Davis MM, Armstrong R, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9 PMCID: PMC5858707.

    Article  PubMed  PubMed Central  Google Scholar 

  80. Chambers DA, Azrin ST. Research and services partnerships: a fundamental component of dissemination and implementation research. Psychiatr Serv. 2013;64(6):509–11. https://doi.org/10.1176/appi.ps.201300032.

    Article  PubMed  Google Scholar 

  81. Pellecchia M, Mandell DS, Nuske HJ, Azad G, Benjamin Wolk C, Maddox BB, et al. Community–academic partnerships in implementation research. J Community Psychol. 2018;46(7):941–52. https://doi.org/10.1002/jcop.21981.

    Article  PubMed  Google Scholar 

  82. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. Am J Public Health. 2010;100 Suppl 1(Suppl 1):S40–6 PMCID: PMC2837458.

    Article  PubMed  Google Scholar 

  83. Khan S, Chambers D, Neta G. Revisiting time to translation: implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control. 2021;32(3):221–30. https://doi.org/10.1007/s10552-020-01376-z.

    Article  PubMed  Google Scholar 

  84. Calleson DC, Jordan C, Seifer SD. Community-engaged scholarship: is faculty work in communities a true academic enterprise? Acad Med. 2005;80(4):317–21.

    Article  PubMed  Google Scholar 

  85. Ghaffar A, Langlois EV, Rasanathan K, Peterson S, Adedokun L, Tran NT. Strengthening health systems through embedded research. Bull World Health Organ. 2017;95(2):87 PMCID: PMC5327943.

    Article  PubMed  PubMed Central  Google Scholar 

  86. Wolfenden L, Yoong SL, Williams CM, Grimshaw J, Durrheim DN, Gillham K, et al. Embedding researchers in health service organizations improves research translation and health service performance: the Australian Hunter New England Population Health example. J Clin Epidemiol. 2017;85:3–11. https://doi.org/10.1016/j.jclinepi.2017.03.007.

    Article  PubMed  Google Scholar 

  87. Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: implementation science centers in cancer control. Transl Behav Med. 2020;11(2):669–75 PMCID: PMC8135092.

    Article  PubMed Central  Google Scholar 

  88. Frazier SL, Formoso D, Birman D, Atkins MS. Closing the research to practice gap: redefining feasibility. Clin Psychol. 2008;15(2):125. https://doi.org/10.1111/j.1468-2850.2008.00120.x.

    Article  Google Scholar 

  89. Fernandez ME, Ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158 PMCID: PMC6592155.

    Article  PubMed  PubMed Central  Google Scholar 

  90. Klaiman T, Silvestri JA, Srinivasan T, Szymanski S, Tran T, Oredeko F, et al. Improving prone positioning for severe acute respiratory distress syndrome during the COVID-19 pandemic. An implementation-mapping approach. Ann Am Thorac Soc. 2021;18(2):300–7 PMCID: PMC7869786.

    Article  PubMed  PubMed Central  Google Scholar 

  91. Glasgow RE, Battaglia C, McCreight M, Ayele RA, Rabin BA. Making implementation science more rapid: use of the RE-AIM framework for mid-course adaptations across five health services research projects in the veterans health administration. Front Public Health. 2020;8:194 PMCID: PMC7266866.

    Article  PubMed  PubMed Central  Google Scholar 

  92. Smith J, Rapport F, O’Brien TA, Smith S, Tyrrell VJ, Mould EV, et al. The rise of rapid implementation: a worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Serv Res. 2020;20(1):449 PMCID: PMC7240003.

    Article  PubMed  PubMed Central  Google Scholar 

  93. Øvretveit J. Implementation researchers can improve the responses of services to the COVID-19 pandemic. Implement Res Pract. 2020;1:2633489520949151 PMCID: PMC7468666.

    PubMed  PubMed Central  Google Scholar 

  94. Nevedal AL, Reardon CM, Opra Widerquist MA, Jackson GL, Cutrona SL, White BS, et al. Rapid versus traditional qualitative analysis using the Consolidated Framework for Implementation Research (CFIR). Implement Sci. 2021;16(1):67 PMCID: PMC8252308.

    Article  PubMed  PubMed Central  Google Scholar 

  95. Ramanadhan S, Revette AC, Lee RM, Aveling EL. Pragmatic approaches to analyzing qualitative data for implementation science: an introduction. Implement Sci Commun. 2021;2(1):70 PMCID: PMC8243847.

    Article  PubMed  PubMed Central  Google Scholar 

  96. Slaughter SE, Zimmermann GL, Nuspl M, Hanson HM, Albrecht L, Esmail R, et al. Classification schemes for knowledge translation interventions: a practical resource for researchers. BMC Med Res Methodol. 2017;17(1):1–11 PMCID: PMC5718087.

    Article  Google Scholar 

  97. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):1–14 PMCID: PMC4328074.

    Article  Google Scholar 

  98. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, et al. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015;10:109 PMCID: PMC4527340.

    Article  PubMed  PubMed Central  Google Scholar 

  99. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. https://doi.org/10.1136/bmj.g1687.

    Article  PubMed  Google Scholar 

  100. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):1–11 PMCID: PMC3882890.

    Article  Google Scholar 

  101. Bragge P, Grimshaw JM, Lokker C, Colquhoun H. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17(1):1–11 PMCID: PMC5336675.

    Article  Google Scholar 

  102. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, et al. Standards for Reporting Implementation Studies (StaRI) statement. BMJ. 2017;356:i6795 PMCID: PMC5421438.

    Article  PubMed  PubMed Central  Google Scholar 

  103. Wolfenden L, Foy R, Presseau J, Grimshaw JM, Ivers NM, Powell BJ, et al. Designing and undertaking randomised implementation trials: guide for researchers. BMJ. 2021;372:m3721 PMCID: PMC7812444.

    Article  PubMed  PubMed Central  Google Scholar 

  104. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, et al. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implement Sci. 2010;5:16 PMCID: PMC2834600.

    Article  PubMed  PubMed Central  Google Scholar 

  105. Lewis CC, Boyd MR, Walsh-Bailey C, Lyon AR, Beidas R, Mittman B, et al. A systematic review of empirical studies examining mechanisms of implementation in health. Implement Sci. 2020;15(1):21 PMCID: PMC7164241.

    Article  PubMed  PubMed Central  Google Scholar 

  106. Burke JG, Lich KH, Neal JW, Meissner HI, Yonas M, Mabry PL. Enhancing dissemination and implementation research using systems science methods. Int J Behav Med. 2015;22(3):283–91 PMCID: PMC4363012.

    Article  PubMed  PubMed Central  Google Scholar 

  107. Hekler EB, Klasnja P, Riley WT, Buman MP, Huberty J, Rivera DE, et al. Agile science: creating useful products for behavior change in the real world. Transl Behav Med. 2016;6(2):317–28 PMCID: PMC4927453.

    Article  PubMed  PubMed Central  Google Scholar 

  108. Bosch M, Van Der Weijden T, Wensing M, Grol R. Tailoring quality improvement interventions to identified barriers: a multiple case analysis. J Eval Clin Pract. 2007;13(2):161–8. https://doi.org/10.1111/j.1365-2753.2006.00660.x.

    Article  PubMed  Google Scholar 

  109. French SD, Green SE, O’Connor DA, McKenzie JE, Francis JJ, Michie S, et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci. 2012;7(1):1–8. https://doi.org/10.1186/1748-5908-7-38.

    Article  Google Scholar 

  110. Colquhoun HL, Squires JE, Kolehmainen N, Fraser C, Grimshaw JM. Methods for designing interventions to change healthcare professionals’ behaviour: a systematic review. Implement Sci. 2017;12(1):30 PMCID: PMC5336662.

    Article  PubMed  PubMed Central  Google Scholar 

  111. Williams NJ. Multilevel mechanisms of implementation strategies in mental health: integrating theory, research, and practice. Adm Policy Ment Health. 2016;43(5):783–98 PMCID: PMC4834058.

    Article  PubMed  PubMed Central  Google Scholar 

  112. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44(2):177–94 PMCID: PMC4761530.

    Article  PubMed  PubMed Central  Google Scholar 

  113. Powell BJ, Haley AD, Patel SV, Amaya-Jackson L, Glienke B, Blythe M, et al. Improving the implementation and sustainment of evidence-based practices in community mental health organizations: a study protocol for a matched-pair cluster randomized pilot study of the Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST-IS). Implement. Sci Commun. 2020;1(9):1–13 PMCID: PMC7207049.

    Google Scholar 

  114. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, et al. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med Res Methodol. 2018;18(1):153 PMCID: PMC6258449.

    Article  PubMed  PubMed Central  Google Scholar 

  115. Hamilton AB, Finley EP. Qualitative methods in implementation research: an introduction. Psychiatry Res. 2019;280:112516 PMCID: PMC7023962.

    Article  PubMed  PubMed Central  Google Scholar 

  116. Lewis CC, Hannon PA, Klasnja P, Baldwin LM, Hawkes R, Blackmer J, et al. Optimizing Implementation in Cancer Control (OPTICC): protocol for an implementation science center. Implement Sci Commun. 2021;2(1):44 PMCID: PMC8062945.

    Article  PubMed  PubMed Central  Google Scholar 

  117. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a compilation of implementation strategies to advance school-based implementation research and practice. Prev Sci. 2019;20(6):914–35. https://doi.org/10.1007/s11121-019-01017-1.

    Article  PubMed  PubMed Central  Google Scholar 

  118. Lyon AR, Cook CR, Locke J, Davis C, Powell BJ, Waltz TJ. Importance and feasibility of an adapted set of implementation strategies in schools. J Sch Psychol. 2019;76:66–77 PMCID: PMC6876555.

    Article  PubMed  PubMed Central  Google Scholar 

  119. Leeman J, Birken SA, Powell BJ, Rohweder C, Shea CM. Beyond “implementation strategies”: classifying the full range of strategies used in implementation science and practice. Implement Sci. 2017;12(1):125. https://doi.org/10.1186/s13012-017-0657-x.

    Article  PubMed  PubMed Central  Google Scholar 

  120. Human Behaviour Change Project. The theory and techniques tool. 2021. https://theoryandtechniquetool.humanbehaviourchange.org/. Accessed 5 Aug 2021.

    Google Scholar 

  121. Ritchie MJ, Kirchner JE, Parker LE, Curran GM, Fortney JC, Pitcock JA, et al. Evaluation of an implementation facilitation strategy for settings that experience significant implementation barriers. Implement Sci. 2015;10(1):1–3 PMCID: PMC4551776.

    Google Scholar 

  122. Ritchie MJ, Dollar KM, Miller CJ, Smith JL, Oliver KA, Kim B, et al. Using implementation facilitation to improve healthcare (version 3): Veterans Health Administration; 2020. https://www.queri.research.va.gov/tools/Facilitation-Manual.pdf. Accessed 24 Nov 2021

    Google Scholar 

  123. Collins LM, Baker TB, Mermelstein RJ, Piper ME, Jorenby DE, Smith SS, et al. The multiphase optimization strategy for engineering effective tobacco use interventions. Ann Behav Med. 2011;41(2):208–26 PMCID: PMC3053423.

    Article  PubMed  Google Scholar 

  124. Collins LM. Optimization of behavioral, biobehavioral, and biomedical interventions: the multiphase optimization strategy (MOST). Switzerland: Springer International Publishing; 2018. https://doi.org/10.1007/978-3-319-72206-1.

    Book  Google Scholar 

  125. Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am J Prev Med. 2007;32(Suppl 5):S112–8 PMCID: PMC2062525.

    Article  PubMed  PubMed Central  Google Scholar 

  126. Beidas RS, Williams NJ, Becker-Haimes EM, Aarons GA, Barg FK, Evans AC, et al. A repeated cross-sectional study of clinicians’ use of psychotherapy techniques during 5 years of a system-wide effort to implement evidence-based practices in Philadelphia. Implement Sci. 2019;14(1):67 PMCID: PMC6588873.

    Article  PubMed  PubMed Central  Google Scholar 

  127. Daniel N, Judith JL, Spiegelman D. Analysis of “learn-as-you-go” (LAGO) studies. Ann Stat. 2019;49(2):793–819. https://doi.org/10.1214/20-AOS1978.

    Article  Google Scholar 

  128. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: application of the distillation and matching model to 615 treatments from 322 randomized trials. J Consult Clin Psychol. 2009;77(3):566–79. https://doi.org/10.1037/a0014565.

    Article  PubMed  Google Scholar 

  129. Engell T, Kirkøen B, Hammerstrøm KT, Kornør H, Ludvigsen KH, Hagen KA. Common elements of practice, process and implementation in out-of-school-time academic interventions for at-risk children: a systematic review. Prev Sci. 2020;21(4):545–56 PMCID: PMC7162823.

    Article  PubMed  PubMed Central  Google Scholar 

Download references

Acknowledgements

We thank Chris Tachibana, PhD, for her careful edits and suggestions, and Christina Johnson, BS, Kelly Zentgraf, MS, Chynna Mills, BS, and Jacqueline Buck, BS, for their editorial support.

Funding

Not applicable.

Author information

Contributions

RSB and MBLF convened the Penn Implementation Science Summit to carry out the pre-mortem exercise. RSB drafted the initial version of the manuscript. BJP drafted most of theme 6. SD, CCL, ARL, BJP, JP, LS, RCS, SWS, and MBLF provided substantive edits, and all authors read and approved the final manuscript.

Authors’ information

RSB is the Founding Director of the Penn Implementation Science Center at the Leonard Davis Institute of Health Economics and Director of the Penn Medicine Nudge Unit; Associate Director of the Center for Health Incentives and Behavioral Economics; and Professor of Psychiatry, Medical Ethics and Health Policy, and Medicine at the University of Pennsylvania. As of September 1, 2022, she will be the Ralph Seal Paffenbarger Professor and Chair of Medical Social Sciences at Northwestern University Feinberg School of Medicine.

SD is a Professor and Associate Chair in the Department of Psychology at the University of Washington and an Adjunct Professor in Global Health and in Psychiatry and Behavioral Sciences.

CCL is a Senior Investigator for the Center for Accelerating Care Transformation at the Kaiser Permanente Washington Health Research Institute; Affiliate Faculty in the University of Washington Department of Psychiatry and Behavioral Sciences; and Co-Director of Kaiser Permanente’s Social Needs Network for Evaluation and Translation.

ARL is a Professor in the University of Washington (UW) Department of Psychiatry and Behavioral Sciences; Co-Director of the UW School Mental Health Assessment, Research, and Training (SMART) Center; Co-Director of the UW ALACRITY Center; and Director of the UW Research Institute for Implementation Science in Education (RIISE).

BJP is Associate Professor, Brown School; Associate Professor of Medicine, Division of Infectious Diseases, John T. Milliken Department of Medicine; Co-Director, Center for Mental Health Services Research, Brown School; and Associate Director, Center for Dissemination and Implementation, Institute for Public Health at Washington University in St. Louis.

JP is an Associate Professor in the Department of Public Health Policy & Management at the New York University School of Global Public Health and Director of Policy Research of the NYU Global Center for Implementation Science.

LS is a Senior Scientist at the Oregon Social Learning Center.

RCS is an Associate Professor in the Department of Sociomedical Sciences at Columbia’s Mailman School of Public Health, Co-Director of the Community Engagement Core Resource, and Director of the Implementation Science Initiative at Columbia University’s Irving Institute CTSA.

SWS is an Associate Professor of Psychiatry and Behavioral Sciences at Stanford University and Member-At-Large, Board of Directors, American Psychological Association.

MBLF is Director of Acute Care Implementation Science at the Penn Implementation Science Center at the Leonard Davis Institute of Health Economics; Co-Director of the Penn Center for Perioperative Outcomes Research and Transformation; and David E. Longnecker Associate Professor of Anesthesiology and Critical Care and Associate Professor of Epidemiology at the Perelman School of Medicine of the University of Pennsylvania. She is the Vice President of the Anesthesia Patient Safety Foundation.

Corresponding author

Correspondence to Rinad S. Beidas.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare the following competing interests: RSB is an Associate Editor and BJP is on the Editorial Board of Implementation Science; all decisions on this paper were made by other editors. RSB is a principal at Implementation Science & Practice, LLC. She receives royalties from Oxford University Press and consulting fees from United Behavioral Health and OptumLabs and serves on the advisory boards for Optum Behavioral Health, AIM Youth Mental Health Foundation, and the Klingenstein Third Generation Foundation outside of the submitted work. MBLF serves on the Board of Directors for the Foundation for Anesthesia Education and Research.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Details about the pre-mortem exercise.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Beidas, R.S., Dorsey, S., Lewis, C.C. et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implementation Sci 17, 55 (2022). https://doi.org/10.1186/s13012-022-01226-3
