Implementation science issues in understanding, collecting, and using cost estimates: a multi-stakeholder perspective

Abstract

Understanding the resources needed to achieve desired implementation and effectiveness outcomes is essential to implementing and sustaining evidence-based practices (EBPs). Despite this frequent observation, cost and economic measurement and reporting, though becoming more frequent, remain rare in implementation science, and when present are seldom reported from the perspective of multiple stakeholders (e.g., the organization, supervisory team), including those who will ultimately implement and sustain EBPs.

Incorporating a multi-level framework is useful for understanding and integrating the perspectives and priorities of the diverse set of stakeholders involved in implementation. Stakeholders across levels, from patients to delivery staff to health systems, experience different economic impacts (costs, benefits, and value) related to EBP implementation and have different perspectives on these issues. Economic theory can aid in understanding multi-level perspectives and approaches to addressing potential conflict across perspectives.

This paper provides examples of key cost components especially important to different types of stakeholders. It provides specific guidance and recommendations for cost assessment activities that address the concerns of various stakeholder groups, identifies areas of agreement and conflict in priorities, and outlines theoretically informed approaches to understanding conflicts among stakeholder groups and processes to address them. Involving stakeholders throughout the implementation process and presenting economic information in ways that are clear and meaningful to different stakeholder groups can aid in maximizing benefits within the context of limited resources. We posit that such approaches are vital to advancing economic evaluation in implementation science. Finally, we identify directions for future research and application.

Considering a range of stakeholders is critical to informing economic evaluation that supports appropriate decisions about resource allocation across contexts, and ultimately successful adoption, implementation, and sustainment. Not all perspectives need to be addressed in a given project, but implementation research needs to identify and understand the perspectives of multiple groups of key stakeholders, including patients and direct implementation staff, who are not often explicitly considered in traditional economic evaluation.

Introduction

An integral component of assessing value for implementation efforts is capturing the costs and benefits of implementing evidence-based practices (EBPs) relevant to various groups of stakeholders [1]. Estimating costs and conducting comparative economic analyses, such as cost-effectiveness analysis and budget impact analysis, can be undertaken from several different perspectives. The perspectives taken in an economic evaluation depend on many factors, including the primary objective, the clinical or community context, and the relevant decision-makers and stakeholders who will utilize the information [2]. Many potential costs exist related to implementing EBPs, but the study perspective is the key determinant of which costs will be relevant to the analysis [3]. Although the traditional “reference case” perspectives for estimating and evaluating costs and benefits are the societal and healthcare sector perspectives [2], the Second Panel on Cost-Effectiveness in Health and Medicine also recommends conducting economic evaluations from narrower perspectives related to the particular interests of key stakeholders [4]. But the perspectives included also reflect value judgments within an economic evaluation; that is, which perspective or perspectives are most important or relevant [3]. When conducting an economic evaluation in implementation science, then, how do we decide which perspective or perspectives to include? The pragmatic answer is that the perspectives of the stakeholders and decision-makers who will be informed by the analysis should be prioritized [3].

Local economic considerations are a critical factor influencing whether individuals and organizations, versus larger healthcare systems and society, adopt and sustain EBPs [5]. The burden of implementation costs is often borne by local organizations and decision-makers; information from this perspective is key to informing resource allocation that can impact how well EBPs are adopted and sustained [6]. Healthcare and other service areas (e.g., education) are plagued by a persistent, largely unexplained failure to launch EBPs—even when there is strong empirical support for the value of the intervention. And even when adopted, EBPs are frequently not sustained [7, 8]. This inconsistency may arise, in part, because the different incentive structures facing key stakeholders are not typically considered in economic evaluations, including within implementation science.

Implementation costs and effects accrue at different levels and vary across different parties. Costs across these levels have a notable impact on implementation success and sustainment. A recent systematic review of implementation costing methods by Bowser et al. reflects this reality; the researchers found a wide range of costing perspectives represented in implementation research, including organizational, facility, provider, and societal perspectives [9]. Yet, traditional economic evaluation does not always account for costs across these levels. Implementation failure is often due to organizations not being prepared to invest in and support effective EBP implementation, or not understanding a priori which costs, and at which levels, they will accrue given their organizational structure. Organizations adopting EBPs need to know what it will cost THEM—in their setting, given the resources available, with consideration of staff capacity, workflows, and patient/participant population [6, 10]. Implementation involves different types of people with diverse values and perspectives who are collectively deciding whether to adopt and ultimately sustain the EBPs that researchers have developed. We have a critical need to focus on the perspectives most relevant to real-world decision-making, and especially to implementation and sustainment, such as the patient, clinician, delivery staff, organizational, or payer perspectives [11]. In particular, the societal perspective typically used for cost-effectiveness analysis in healthcare aggregates diverse stakeholder perspectives into a single, global perspective, but the accrual of value is often unevenly distributed across stakeholder groups. Each of these stakeholder groups has different perspectives and different values on which they base economic decisions [12].

The purposes of this paper are to (1) apply a multi-level framework to identify key types of stakeholders in implementation efforts and what cost information (broadly defined) they need for decision making; (2) discuss how to consider multiple perspectives simultaneously and options for applying economic theory to advance coordination across perspectives; (3) address the role of stakeholder partnerships and engagement in determining relevant implementation cost data, pragmatic costing tools, and reporting guides to facilitate this; and (4) present examples of multi-level cost application and provide recommendations and future directions for the field.

Multi-level framework for stakeholder groups

Incorporating a multi-level framework is useful for integrating the perspectives and priorities of the diverse set of stakeholders involved in implementation research. Ferlie and Shortell’s [13] multi-level model of system change offers a guide to consider various economic perspectives in the implementation of EBPs (see Table 1). Our adapted model includes five levels: (1) the policy and economic environment (e.g., regulatory, financial, payment regimes, and markets), under which organizations, delivery teams, and individual participants operate; (2) the organization (e.g., hospital, clinic, school) that supports the development and work of teams by providing infrastructure and resources; (3) the management team: supervisory staff who are responsible for staffing and other resource allocation decisions; (4) the provider team: professional care providers or the front-line staff who implement the EBP; and (5) the individual intervention participant (e.g., patient, student, family).

Table 1 Key cost considerations mapped to perspectives, priorities, and stage of implementation for different types of stakeholders

Table 1 provides a summary of key cost considerations across different perspectives and implementation phases, with examples of costs and priorities. Implementation resource requirements change over time (i.e., across implementation phases) and vary across stakeholder groups. We offer the following school-based example related to Table 1, focusing on pre-implementation costs. A state department of education notes an increase in reports of mental health issues among students. As a result, the department leadership (policy and economic environment; outer context) mandates that districts across the state offer social and emotional learning (SEL) curricula. Such mandates can be unfunded, underfunded, or funded [16]. In this case, the decision-makers are responding to a current gap in “care” but do not have specific funds to allocate to this effort, so they decide to reallocate a limited amount of funding from after-school programming to pay for the interventions only (i.e., not their implementation). Their key priority is to incentivize districts to engage in SEL instruction to reduce the statewide economic and social burden of youth mental health issues. The organizations, in this case the school districts, must consider how to incorporate this new mandate and stay within the district budget. They consider not just the costs of the intervention, which is covered by the mandate, but the costs associated with implementation, including identifying suitable curricular options, ensuring the options are compatible with current district technology and other mental health initiatives, and training school staff in the new EBP. They must also consider that this mandate will reduce funding for their current after-school programming.
The management team, in this case the principals and other school-level leadership, must pick from a set of alternatives in consideration of the schools’ overall workload, competing demands, and opportunity cost, or the (health) benefits lost from other alternatives when one is chosen [2]. In this case, the school may have to divert resources and staff time from other initiatives (e.g., after-school programs, substance use prevention programming) for enhanced SEL instruction [16]. For frontline delivery staff (e.g., teachers), time is the primary resource concern when faced with an adoption decision [1]. There will be opportunity costs related to the time needed for training, logistics, and preparation for adopting a new SEL curriculum. For the individual participant (i.e., the student), there will be opportunity costs related to foregone instruction and support on other health-related issues (e.g., after-school time/support, substance use). This example illustrates that the decision to adopt an evidence-based practice must represent a clear and sequential win-win scenario from the perspectives of multiple stakeholder levels; failure at any level can lead to failure in EBP adoption and sustainment. Cooperation and coordination across multiple stakeholders, including those whose perspectives are not routinely considered in economic analyses of implementation (e.g., patients and frontline providers), are required for successful implementation and sustainment.
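The opportunity-cost arithmetic in this example can be made concrete with a minimal sketch. All figures below (teacher wage, training hours, reallocated after-school budget) are hypothetical, chosen only to illustrate the calculation; none come from the cited studies.

```python
def opportunity_cost(hours_diverted, hourly_wage, displaced_program_cost):
    """Total opportunity cost of adopting the new curriculum:
    the value of staff time redirected to training and preparation,
    plus the cost of programming foregone to fund the intervention."""
    return hours_diverted * hourly_wage + displaced_program_cost

# Hypothetical district: 20 teachers each spend 8 hours in SEL training,
# valued at a $40/h wage rate, and $5,000 of after-school programming
# is reallocated to pay for the curriculum.
total = opportunity_cost(hours_diverted=20 * 8, hourly_wage=40,
                         displaced_program_cost=5_000)
print(total)  # 160 h x $40/h + $5,000 = 11400
```

A district could repeat this calculation from each stakeholder's perspective (e.g., counting only teacher time for the provider team, only the displaced budget for the organization) to see how the burden is distributed.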

Economic theory and multi-level stakeholder perspectives

The purpose of economic evaluation in healthcare is to maximize health-related outcomes subject to a set of constraints [17]. This purpose, however, raises important questions about which outcomes will be maximized, what constraints exist, and how these may differ across perspectives. Which costs and benefits are most relevant is inherently dependent on the perspective. Decision-makers may not recognize the impact of their decisions on costs and outcomes from different perspectives because these impacts are not within their scope or their direct concern [17]. When decisions are made without sufficient consideration of their impacts on costs and benefits across different perspectives, they will seldom maximize overall benefits under given resource constraints.

Applying economic theory in the context of implementation research can aid in identifying and understanding multi-level perspectives and approaches to integrating these perspectives. Applying principles of economic theory can also help increase the relevance of implementation cost information used in decision-making across different stakeholder groups, ultimately enhancing uptake, implementation, and sustainment of EBPs. Below, we provide an example of applying economic theory to implementation issues using multiple stakeholder perspectives. Such examples illustrate important first steps in recognizing potential areas of conflict or unintended economic effects across stakeholder groups and developing plans for resolution. Economic theory can be useful in providing guidance and support around decision-making in the implementation of health services [17], but there are currently few such examples.

Cooperative game theory

Understanding the distribution of costs and benefits across stakeholders is critical to designing implementation strategies that facilitate economic cooperation among stakeholders. There are numerous examples in implementation science where stakeholders can reduce total joint costs and realize savings by pooling resources and cooperating. Healthcare delivery organizations entering into accountable care organizations or bundled payment models, for example, must allocate the total costs and savings from episodes of care [18]. Alternatively, organizational managers and frontline providers can allocate time-consuming tasks according to principles that promote collaboration. In these cases, each stakeholder may face different gains and costs related to cooperating and possess different bargaining power. The question becomes how to allocate the gains (benefits and costs) from cooperation so that each stakeholder is incentivized to collaborate. While cooperative game theory and its methods for determining optimal allocations of costs and benefits across stakeholder groups [19] can inform economic evaluation, the practical application of these theories in implementation science is still developing. These theoretically informed allocation methods, however, have the potential to facilitate economic collaboration across stakeholders and ultimately support sustainable economic and health outcomes that would not be possible when stakeholders act independently.
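One classic allocation method from cooperative game theory is the Shapley value, which assigns each party its average marginal contribution to the coalition's total cost. The sketch below is illustrative only: the two stakeholders ("org" and "providers") and the dollar figures are hypothetical assumptions, not drawn from any cited study, and real applications would involve more parties and empirically estimated coalition costs.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, coalition_cost):
    """Shapley value for a cost-sharing game.

    `coalition_cost` maps a frozenset of players to the total cost that
    coalition would incur by cooperating. Each player's share is the
    weighted average of its marginal cost over all coalition orderings.
    """
    n = len(players)
    values = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for r in range(len(others) + 1):
            for coalition in combinations(others, r):
                s = frozenset(coalition)
                weight = (factorial(len(s)) * factorial(n - len(s) - 1)
                          / factorial(n))
                marginal = coalition_cost[s | {p}] - coalition_cost[s]
                total += weight * marginal
        values[p] = total
    return values

# Hypothetical costs (in $1,000s): implementing alone vs. pooling
# training and data infrastructure. Cooperation saves 40 in total.
costs = {
    frozenset(): 0,
    frozenset({"org"}): 100,
    frozenset({"providers"}): 80,
    frozenset({"org", "providers"}): 140,
}
print(shapley_values(["org", "providers"], costs))
# Each party pays less than it would alone (org: 80 vs. 100,
# providers: 60 vs. 80), so both are incentivized to cooperate.
```

The point of such an allocation is exactly the incentive property described above: no stakeholder pays more under cooperation than it would acting independently, so the joint savings are stable.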

Theories from related fields such as decision sciences (the study of making optimal choices based on available information, including cognitive biases) and behavioral economics (concerned with the bounds of rationality of decision-makers) can also aid in understanding multiple stakeholder perspectives when considering the costs and benefits of specific courses of action [20, 21]. A full review of relevant economic theoretical constructs, behavioral economics, and decision science approaches and their potential application to multi-stakeholder perspectives in implementation is beyond the scope of this paper. Our point is to highlight the potential role theories from other fields can play in considering multi-level stakeholder perspectives to advance the objectives of economic evaluation, and to identify this as an important area for future research.

Stakeholder engagement

Stakeholder engagement is vital for closing the gap between research and practice. The implementation science field increasingly recognizes that participatory approaches are foundational to successful implementation and sustainment, including from an economic perspective [22]. Community engagement can occur along a continuum, from stakeholders as participants in research with little or no active role to stakeholders as equal partners engaged across all stages of the research process [22,23,24]. Methods such as Community-Based Participatory Research (CBPR) can increase the impact of economic evaluation of implementation by increasing both the rigor and relevance of the research; this is accomplished by engaging the stakeholders who are affected by, and make decisions about, the allocation of resources [22]. A comprehensive discussion of the wide range of stakeholder engagement methods is beyond the scope of this article, but they vary by factors such as stakeholder group, purpose, budget, time, and staffing [22,23,24]. There are even resources available to assist in the selection of engagement methods for a given implementation project [25]. And while suitable levels of partner engagement across this continuum may vary, we encourage those embarking on economic evaluation of implementation to employ methods such as CBPR. Such approaches have notable potential to accelerate bridging the gap between research and practice and to enhance the likelihood of developing feasible and sustainable policies, programs, and strategies to address identified problems [22].

Understanding cost and benefit implications and perceived value from various stakeholder perspectives, from end-users to delivery staff to the larger system/environment, and what implementation cost information is most beneficial for each of these groups, is critical to advancing economic analyses in implementation research. Stakeholder engagement across levels is especially critical when organizations face overwhelming competing demands and scarce resources and/or stakeholders experience disproportionate costs related to implementation, such as opportunity costs for frontline staff engaging in implementation efforts in lieu of billable patient care [9]. Assessing the perspectives of different stakeholders has the potential to uncover conflicting priorities on costs and other economic issues. This can create challenges for both research reporting and practical application. To maximize the chances of successful adoption, implementation, and sustainment, it is important to understand the perspectives of all relevant stakeholders for a given implementation effort. When summarizing results for decision-makers in cases where there are discrepancies among stakeholders, we would benefit from presenting results in transparent ways that illustrate the impact of different perspectives. Often this can include sensitivity analyses based on different perspectives and assumptions related to these perspectives. Identifying optimal ways to facilitate decision-making and program/implementation strategy adaptations when perspectives differ is an area for future investigation.

Low-resource contexts

Multi-level perspectives are especially important when considering implementation efforts in low-resource contexts. For example, if organizational leadership wishes to implement an EBP to reduce system costs (e.g., a fall prevention program) but the resources required to effectively adopt, implement, and sustain it, such as substantial time from highly paid professional staff (the provider team), are not aligned with resource capacity within a setting, the likelihood of implementation success is low. Such misalignment can exacerbate rather than mitigate health disparities [26]. Settings such as community health centers, rural primary care clinics, budget-strapped school systems, inner-city community-based organizations, and low-income countries need to consider the resource constraints of the context when embarking on EBP implementation. Considering cost-related implementation issues in less-resourced contexts, whether organizations, communities, or countries, is understudied and requires several additional considerations.

First, while it may still be possible to implement an EBP in a low-resource setting, substantial, and potentially costly, adaptations to the intervention and/or implementation strategy are often needed. Identifying and conducting initial rapid cost estimates of lower-cost implementation strategies for an EBP is vital to addressing equity issues in implementation and sustainment. Conducting this type of rapid cost estimate requires considering organizational/community capacity and resources from multiple perspectives (e.g., the organization, management, and provider teams). How to tailor EBPs and implementation strategies to low-resource settings and changing contexts is a separate topic addressed in detail elsewhere [27, 28], but considering costs and benefits in the face of competing demands and scarce resources is vital to achieving desired outcomes and mitigating health disparities.

Second, both researchers and stakeholders may consider adopting innovative approaches that substantially reduce costs, as commonly seen in implementation efforts in low- and middle-income countries. In some instances, resource-challenged settings have demonstrated high levels of success in implementation and health outcomes at a dramatically reduced cost. This is sometimes achieved through “task-shifting,” in which delivery of an EBP is accomplished by lower-level—and thus lower-cost—staff members such as community health workers or peer coaches [29]. Utilizing information from innovative implementation efforts designed to reduce costs, such as those within lower-income countries, has the potential to guide implementation across settings to maximize health outcomes and minimize costs.

Multi-level perspectives in cost reporting

While reporting guidelines exist for economic evaluation, including the Consolidated Health Economic Evaluation Reporting Standards (CHEERS), these guides focus primarily on healthcare payer or societal perspectives [30]. While comprehensive, the societal approach aggregates costs across all perspectives and thus can limit understanding of the economic implications of implementation specific to each stakeholder group. Including multiple perspectives, while analytically burdensome, offers the opportunity to evaluate the economic consequences of different implementation decisions from various viewpoints [2]. This expanded consideration of multi-level perspectives offers a bridge to the practical application of economic research in the field of implementation science. A key next step in achieving this objective is designing implementation economic reporting guidelines based on established guides such as CHEERS for practical application in healthcare and other community organizations.

Developing reporting guides can aid in identifying areas of cooperation and conflict and in applying theories from economics, behavioral economics, and decision sciences to address economic barriers within and across stakeholder groups. This is consistent with conclusions from a recent systematic review, in which researchers recommended standardized guidelines around costing perspectives, instrumentation, and reporting as critical next steps to advance the field [9]. At the individual participant level, for example, the priority may be to improve health outcomes, satisfaction, or quality of life; in contrast, at the organizational level, the priority is often staying within budget. Standardized guides can support cost reporting from multiple perspectives. When we prospectively account for multiple perspectives, we can enhance the likelihood of satisfying multiple stakeholders when determining implementation priorities; this will ultimately advance the public health impact and sustainment of evidence-based interventions and related implementation strategies. Yet, it is also important that “analysts undertake ‘costing in context’” even within each analytic perspective; at a particular level, for example the organization, priorities may differ: some organizations care about budget, others about mission, and others about profit [3]. The developing application of decision sciences and economic theory in implementation science, in combination with other developing areas such as mixed methods in economic evaluation of implementation, will aid in creating costing tools most relevant to the context.

Another important consideration is the reporting of uncertainties in cost estimates and cost-effectiveness/benefit analyses. Each cost and benefit input (parameter) carries model/structural and methodological assumptions (e.g., that costs remain stable over time) that can influence estimates and, ultimately, conclusions from economic evaluations. Various types of sensitivity analysis can be performed to investigate the influence of these uncertainties on the economic output. For instance, will small changes in an input parameter, such as provider or staff type or time, lead to a large change in cost and/or the relative benefit of adopting an innovation? These sensitivity analyses, from simple (one-way sensitivity analysis) to complex (probabilistic sensitivity analysis), can provide stakeholders with estimates of each model parameter’s impact on the economic output. For example, increasing the labor cost of front-line staff from the base-case assumption may change an implementation plan from cost-effective to not cost-effective. An example of sensitivity analysis and the information it can provide to stakeholders is included in our case example. Another option is to build user-friendly model tools, such as those described previously, so that different organizations can easily change input assumptions to fit their specific needs; this again supports the concept of “costing in context” [3]. For example, organization A could include costs related to facilitation and professional learning communities, while organization B includes only facilitation. They could also vary assumptions to fit their conditions (e.g., organization A has labor costs of $25/h, organization B of $20/h). Such tools could also include expanded outputs and thus be more instructive for different stakeholders [31]. For example, the model output could show “hours worked” by occupation under various scenarios: under the “status quo,” nurses work 40 h/week, but under the “intervention” (i.e., the implementation strategy condition), they work 45 h/week. In this way, organizations would know that they need either to hire more nurses or to pay overtime to keep nurses on board with the intervention implementation. Tools to support standardization of multi-level costing approaches and reporting will also aid in building the business case for implementation science; this will facilitate harmonizing data across implementation studies and provide needed empirical evidence on the costs and benefits of implementation efforts for all stakeholders [9].
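A user-adjustable cost tool of the kind just described can be sketched in a few lines. The wage rates ($20/h vs. $25/h) and the 40- vs. 45-hour scenarios echo the hypothetical figures in the text; the staff count is an added assumption for illustration.

```python
def weekly_labor_cost(hours_per_week, wage_per_hour, n_staff):
    """Weekly labor cost for one occupation under one scenario."""
    return hours_per_week * wage_per_hour * n_staff

def incremental_cost(hours_per_week, wage_per_hour, n_staff):
    """Extra weekly labor cost of the implementation strategy
    relative to a 40 h/week status quo."""
    status_quo = weekly_labor_cost(40, wage_per_hour, n_staff)
    return weekly_labor_cost(hours_per_week, wage_per_hour, n_staff) - status_quo

def one_way_sensitivity(base, param, values, compute):
    """One-way sensitivity analysis: vary a single input across a range
    while holding all other inputs at their base-case values."""
    results = {}
    for v in values:
        inputs = dict(base, **{param: v})
        results[v] = compute(**inputs)
    return results

# Base case: nurses move from 40 to 45 h/week under the strategy;
# 10 nurses is a hypothetical staffing assumption.
base = {"hours_per_week": 45, "wage_per_hour": 25, "n_staff": 10}

# Organization A ($25/h) vs. organization B ($20/h): same tool,
# different input assumptions ("costing in context").
print(one_way_sensitivity(base, "wage_per_hour", [20, 25], incremental_cost))
# 5 extra hours x 10 nurses -> $1,000/week at $20/h, $1,250/week at $25/h
```

An organization could vary `n_staff` or `hours_per_week` the same way, turning the single base-case estimate into a range that shows decision-makers which assumptions drive the bottom line.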

The illustration above demonstrates some of the complexities involved in collecting data and reporting the results of economic analyses in implementation research. These issues are especially consequential when summarizing findings for non-economist audiences or organizational decision-makers. In our experience, most decision-makers in potential adopting sites are not concerned with the costs observed in a randomized research study or with detailed economic concepts such as marginal costs and economies of scale. Rather, they are concerned with what it would cost them to implement the program in question in their setting—that is, replication costs. Underlying economic factors such as marginal costs come into play when calculating and presenting options and potential adaptations (using sensitivity analyses), but decision-makers are often frustrated by the many details, qualifications, and jargon of economics and implementation science. They want to know, “What is the bottom line for me, in this setting, under our conditions?”

Case example: multi-level perspectives in implementation science

Frontline providers and organizational perspectives: implementing a fall prevention program

Frontline providers are often those most impacted daily by the implementation of new evidence-based practices. As a group, frontline providers are generally interested in patient care and safety and the resources required to achieve those goals. For instance, frontline hospital staff involved in a patient safety intervention identified patient safety as a priority, but cited a lack of proper equipment/supplies and facility issues as major operational barriers that their institutions did not prioritize when implementing a new policy [32]. Additionally, when considering the costs (usually dominated by labor costs) associated with frontline providers implementing and sustaining a new intervention, opportunity costs need to be estimated, since frontline healthcare delivery staff must negotiate tradeoffs with productivity. Providers generally identify patient needs (e.g., preventing falls in an elderly population) as a priority. Yet, when the burden of implementing a new intervention is high, such as when it takes a substantial portion of providers’ productive time, or when providers lack access to the support and resources needed to effectively and efficiently implement the EBP, implementation efforts are unlikely to succeed.

In contrast, policies and practices are usually decided at the upper administration (organizational) level based on financial incentives. For instance, hospitals have a financial incentive to reduce fall-related injuries, so an EBP addressing falls may be a priority at the policy and/or organizational level [33]. In theory, if fall rates decline, the cost offsets associated with implementation would be realized. However, from the frontline provider’s perspective, this requires more time invested in performing frequent fall-related interventions, an opportunity cost for a workforce that must balance competing priorities for its time. For example, an implementation study of a fall-prevention program involved frontline nurses who received additional training and had to support the implementation of the intervention [34]. Administrators at the hospitals wanted to reduce fall rates due to financial incentives based on Medicare policy changes; implementation of the intervention focused on training nursing staff in critical thinking about fall risk and promoting hourly nursing rounds. Implementation of the fall-prevention program yielded net savings between $817,000 and $1,950,000, and the time frontline nurses spent on fall-related activities, including fall-risk assessments, assisting with activities of daily living, documenting and ordering fall-related equipment and supplies, and communicating with the medical care team, was reduced by 48 to 59%.
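The net-savings logic of this example can be sketched as follows. The input figures are hypothetical stand-ins (the cited study did not report its inputs in this form), chosen only so the result falls in the same broad range as the reported savings.

```python
def net_savings(falls_avoided, cost_per_fall, implementation_cost,
                nurse_hours_saved, nurse_wage):
    """Organizational net savings from a fall-prevention program:
    avoided fall-related costs plus the value of freed nurse time,
    minus the cost of implementing the program."""
    offsets = falls_avoided * cost_per_fall + nurse_hours_saved * nurse_wage
    return offsets - implementation_cost

# All inputs are hypothetical: 60 falls avoided at $30,000 each,
# a $400,000 implementation cost, and 5,000 nurse-hours freed at $40/h.
print(net_savings(falls_avoided=60, cost_per_fall=30_000,
                  implementation_cost=400_000,
                  nurse_hours_saved=5_000, nurse_wage=40))
# 60*30,000 + 5,000*40 - 400,000 = 1600000
```

Separating the offsets into avoided-fall costs (accruing to administrators) and freed nurse time (accruing to frontline staff) makes the multi-stakeholder win-win described below explicit in the arithmetic.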

This example highlights where a solution (cooperation between the healthcare administrators and frontline nursing staff) resulted in both the hospital administrators and frontline nursing staff benefiting from the fall-prevention program that was implemented at their facilities despite potential conflicts—hospital administrators achieved net savings while frontline staff had more time to devote to other duties.

Guidelines for including multiple stakeholder perspectives

We provide specific guidelines and recommendations for cost assessment activities involving multiple stakeholders in Table 2, guided by the steps of Rapid-Cycle Research as described by Johnson et al. [35]. Although the guidance in Table 2 would not always produce “rapid” results or actions, the steps are well suited to the issues raised in this paper. For example, in the pre-condition phase, a multi-level stakeholder approach would focus on identifying the types of stakeholders (e.g., the organization, the management team, and the provider team) involved in the implementation effort and impacted by the costs of implementation. We anticipate that this approach in the pre-implementation phase will expand the inclusion of stakeholders and relevant perspectives when considering the costs and consequences of implementation efforts. We apply this approach to multi-stakeholder cost considerations for several reasons. First, the approach is designed to guide systematic and practical adoption of evidence-based practices. Second, it maps onto phases commonly identified across implementation science frameworks (i.e., pre-implementation, implementation, sustainment) and thus can be applied across a variety of implementation efforts. Third, it is designed to guide solution development with the involvement of multiple levels of stakeholders. As this research is still evolving, we also provide hypotheses for future research to confirm, disconfirm, or amend these recommendations.

Table 2 Rapid-cycle approach to incorporating multi-level stakeholder economic perspectives when adopting an evidence-based practice. Adapted from Johnson et al. [35]

As the field of implementation science builds the evidence base on estimating costs and conducting comparative economic evaluations of implementation, a vital first step is to specify the perspectives and cost considerations, including the specific costs relevant to each stakeholder group. Costs and cost-effectiveness can vary considerably by perspective and costing methodology [3], and recognizing these variations, along with approaches to resolving potential areas of conflict, will be essential to advancing the business case for implementation in health care and public health.

Recommendations

Table 2 summarizes the key steps in this approach. To successfully plan, implement, sustain, and spread a program, it is important to collect cost data from the perspectives of all stakeholder groups; the costs and priorities of frontline providers or individual participants are often omitted. As noted in the table, we hypothesize that such omissions result in inferior long-term results and a lack of sustainment. It is important not just to collect data from these stakeholders but to engage them using participatory approaches [22] and to address conflicting economic priorities guided by economic theories. However, these theories were largely developed independently of the social, cultural, and institutional embedding of relationships among stakeholder groups. Future research can build on previous lessons learned by adapting and applying economic theories, as well as those from related fields such as behavioral economics and decision sciences, for practical application in implementation science. Applying these ideas can aid in understanding areas of congruence and conflict across stakeholder groups and in successfully supporting stakeholder cooperation. Theoretically grounded research is critical to answering key contextual implementation questions such as “what works, when, where, and why” and to minimizing both monetary and non-monetary costs and unintended economic consequences. In addition, applying approaches such as mixed methods (see Solution testing in Table 2 [22]) will enhance the relevance of implementation economic analyses and deepen our understanding of the complexities of multiple stakeholder perspectives, allowing economic theory to be applied more effectively in designing solutions.

Working with stakeholders during pre-implementation phases to “model” the approximate costs of an intervention and its associated implementation strategy (or strategy bundle, as often used), and to consider their potential impact on resource demands and allocation (as well as benefits) across stakeholder levels, may ultimately save time and money and enhance cooperation on economic issues. Adapting existing tools, such as the economic impact inventory template proposed by Neumann et al., can advance the field by organizing and presenting the various types of costs and consequences from the perspectives in which they occur [2]. Facilitated engagement around implementation cost across stakeholder types to address these differing perspectives may therefore enhance the likelihood of implementation success and sustainment.
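One lightweight way to organize such a multi-level inventory is sketched below in Python. It is loosely inspired by the impact-inventory idea in Neumann et al. [2]; the perspectives, line items, and dollar amounts are hypothetical placeholders, not content from the cited template (negative entries denote cost offsets).

```python
# Sketch: a multi-stakeholder cost inventory. Perspectives, items, and
# amounts below are hypothetical illustrations only.

from collections import defaultdict

inventory = defaultdict(list)

def add_item(perspective: str, item: str, cost: float) -> None:
    """Record a cost (or a negative cost, i.e., an offset) under one perspective."""
    inventory[perspective].append((item, cost))

add_item("organization", "staff training sessions", 25_000)
add_item("organization", "avoided penalty payments", -60_000)
add_item("frontline staff", "hourly rounding time", 12_000)
add_item("patient", "out-of-pocket equipment", 1_500)

def total(perspective: str) -> float:
    """Net cost for one perspective (negative = net savings)."""
    return sum(cost for _, cost in inventory[perspective])

for p in inventory:
    print(f"{p}: net {total(p):+,.0f}")
```

Tallying the same implementation effort separately by perspective makes it easy to see where one group’s savings are another group’s burden, which is the kind of conflict the facilitated engagement described above is meant to surface.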

Limitations

This article has several limitations. First, the literature is not yet sufficient to support a systematic review, and to our knowledge there have been no comparative studies of different methods to address the issues above. Stakeholders often disagree, and engaging multiple parties in open discussion and problem-solving to reach consensus is at present more art than science. More research is needed on the specific types and formats of cost feedback to provide to different stakeholders; Cidav et al. [36] provide examples of feedback displays on the costs of implementation strategies, but far more user testing and experimentation are needed in this area. Second, and somewhat ironically, the cost and burden of collecting cost data and providing feedback to multiple parties at multiple points in time can be considerable. Third, this paper is restricted to issues of cost and does not comprehensively consider effectiveness, benefits, or budget impact. Cost is only part of the value equation and by itself is not sufficient to guide decision-making; issues of conceptualizing, assessing, providing feedback on, and using benefits are covered elsewhere [2, 37] and are beyond the scope of this article. Finally, the rapid and frequent adaptations often made in implementation projects complicate costing, as does attributing costs to specific implementation strategies when ever-changing bundles of strategies are used. These issues are discussed in the Quanbeck et al. paper (Quanbeck A, Cidav Z, Eisman A, Garner B, Glasgow R, Wagner T. Approaches for estimating the cost of implementation strategies, submitted for review) included in this special collection.

Other potentially useful theories for advancing multiple stakeholder perspectives in economic evaluation in implementation science, such as those from behavioral economics and decision sciences, were not fully addressed here. Zimmerman et al., for example, have leveraged the concept of bounded rationality as a foundation for comprehensive simulation models that aid healthcare stakeholders in providing treatment services for veterans with posttraumatic stress disorder [38]. These and other research efforts [12, 39] aim to demonstrate the relevance and practical utility of behavioral economic concepts such as bounded rationality for implementation research, particularly when considering the decision-making approaches and perspectives of different stakeholders.

Conclusions and future directions

There has been a recent surge of publications on collecting costs in implementation science [9, 14, 31, 36] and an emerging consensus on the importance of economic influences on implementation outcomes (see other articles in this collection). This paper contributes to economic evaluation by discussing a range of issues involved in considering real-world implementation costs from the perspective of multiple stakeholders, to advance both the science and practice of implementation. Specifying the key types of stakeholders will help put the deceptively simple advice to “consider multiple perspectives” into practice and inform both researchers and implementers. Applying economic theory lets us go beyond identifying the costs relevant to each stakeholder group: it supports a deeper understanding of differing perspectives and nonmonetary influences on decision-making, helps us recognize (and build upon) areas of overlap, and offers ways to rectify areas of conflict.

We conclude that there are compelling reasons to collect, report, and understand costs from the perspectives of different stakeholders. Failure to do so may result in (1) lack of program adoption, especially if organizational and policy perspectives are not considered; (2) poor reach to individual participants, especially those at high risk and with few resources; (3) poor quality implementation, especially if not considering the perspective of both supervisory and delivery staff; and (4) poor sustainment when not considering all these perspectives.

Our recommendations are based upon the synthesis of existing literature, input from the diverse disciplines represented on our authorship team, various economic theories, applied experience collecting cost data, and implementation science frameworks, but not on a systematic review or meta-analyses.

The recommendations we offer above and in Table 2 should be tested and revised as necessary based upon their potential usefulness. As with non-economic applications, understanding and working with stakeholders before, during, and after program implementation is essential. We hypothesize that collecting, estimating, reporting, and discussing costs with different types of stakeholders will enhance the adoption, implementation, and sustainment of EBPs. The specific and most appropriate activities, collection strategies, feedback and reporting methods, timing, and frequency of assessments are yet to be determined and will likely vary across settings and contextual factors. Future research should test different methods of cost collection—evaluating their pros and cons and including assessment of the costs and burden of collecting and using cost data by different stakeholders. There is both an important need and a great opportunity to advance implementation science by including and testing different pragmatic approaches to costing from the perspective of different stakeholders.

Availability of data and materials

Not applicable.

References

  1. Jones Rhodes WC, Ritzwoller DP, Glasgow RE. Stakeholder perspectives on costs and resource expenditures: tools for addressing economic issues most relevant to patients, providers, and clinics. Transl Behav Med. 2018;8(5):675–82. https://doi.org/10.1093/tbm/ibx003.

  2. Neumann P, Sanders G, Basu A, Brock D, Feeny D, Krahn M, et al. Recommendations on perspectives for the reference case. In: Neumann P, Sanders G, Russell L, Siegel J, Ganiats T, editors. Cost-effectiveness in health and medicine. 2nd ed. New York: Oxford University Press; 2016. p. 67–73. https://doi.org/10.1093/acprof:oso/9780190492939.003.0003.

  3. Drummond M, Sculpher M, Claxton K, Stoddart G, Torrance G. Methods for the economic evaluation of health care programmes. Oxford; New York: Oxford University Press; 2015. (Oxford Medical Publications).

  4. Sanders G, Neumann P, Basu A, Brock D, Feeny D, Krahn M, et al. Recommendations for conduct, methodological practices, and reporting of cost-effectiveness analyses: Second Panel on Cost-Effectiveness in Health and Medicine. JAMA. 2016;316(10):1093–103. https://doi.org/10.1001/jama.2016.12195.

  5. Eisman A, Kilbourne A, Dopp A, Saldana L, Eisenberg D. Economic evaluation in implementation science: making the business case for implementation strategies. Psychiatry Res. 2020;283:112433.

  6. Ribisl K, Leeman J, Glasser A. Pricing health behavior interventions to promote adoption: lessons from the marketing and business literature. Am J Prev Med. 2014;46(6):653–9. https://doi.org/10.1016/j.amepre.2014.02.008.

  7. Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117. https://doi.org/10.1186/1748-5908-8-117.

  8. U.S. Department of Education. Prevalence and implementation fidelity of research-based prevention programs in public schools: final report. Washington, D.C.: U.S. Department of Education, Office of Planning, Evaluation and Policy Development, Policy and Programs Study Service; 2011. Report No.: ED-00-CO-0119.

  9. Bowser D, Henry B, McCollister K. Cost analysis in implementation studies of evidence-based practices for mental health and substance use disorders: a systematic review. Implement Sci. 2021;16(1):26. https://doi.org/10.1186/s13012-021-01094-3.

  10. Peek CJ, Glasgow RE, Stange KC, Klesges LM, Purcell EP, Kessler RS. The 5 R’s: an emerging bold standard for conducting relevant research in a changing world. Ann Fam Med. 2014;12(5):447–55. https://doi.org/10.1370/afm.1688.

  11. Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. Oxford; New York: Oxford University Press; 2018. p. 89–106.

  12. Quanbeck A. Using stakeholder values to promote implementation of an evidence-based mobile health intervention for addiction treatment in primary care settings. JMIR Mhealth Uhealth. 2019;7(6):e13301.

  13. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q. 2001;79(2):281–315. https://doi.org/10.1111/1468-0009.00206.

  14. Wagner T, Yoon J, Jacobs J, So A, Kilbourne M, Yu W, et al. Estimating costs of an implementation intervention. Med Decis Making. 2020;40(8):959–67. https://doi.org/10.1177/0272989X20960455.

  15. Gamlen C, Clancy T, Moengen D, Rauen J. Measuring return on investment in complex healthcare systems. J Nurs Adm. 2012;42(7–8):353–5. https://doi.org/10.1097/NNA.0b013e3182619165.

  16. Lyon A, Bruns E. From evidence to impact: joining our best school mental health practices with our best implementation strategies. School Mental Health. 2019;11(1):106–14. https://doi.org/10.1007/s12310-018-09306-w.

  17. Meltzer D, Basu A, Sculpher M. Theoretical foundations of cost-effectiveness in health and medicine. In: Neumann P, Sanders G, Russell L, Siegel J, Ganiats T, editors. Cost-effectiveness in health and medicine. 2nd ed. New York: Oxford University Press; 2016. p. 39–65. https://doi.org/10.1093/acprof:oso/9780190492939.003.0002.

  18. Kolker A. The concept of the Shapley value and the cost allocation between cooperating participants. In: Encyclopedia of information science and technology. 4th ed. IGI Global; 2018. p. 2095–107.

  19. Shapley L. A value for n-person games. In: Kuhn H, Tucker A, editors. Contributions to the theory of games, volume II. Princeton: Princeton University Press; 1953. p. 307–18. Available from: https://www.degruyter.com/princetonup/view/book/9781400881970/10.1515/9781400881970-018.xml

  20. Reed D, Niileksela C, Kaplan B. Behavioral economics. Behav Anal Pract. 2013;6(1):34–54. https://doi.org/10.1007/BF03391790.

  21. Center for Health Decision Science. What is decision science? Center for Health Decision Science, Harvard T.H. Chan School of Public Health; 2017 [cited 2021 Apr 5]. Available from: https://chds.hsph.harvard.edu/approaches/what-is-decision-science/

  22. Minkler M, Salvatore A, Chang C. Participatory approaches for study design and analysis in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. Oxford University Press; 2017.

  23. Balazs C, Morello-Frosch R. The three Rs: how community-based participatory research strengthens the rigor, relevance, and reach of science. Environmental Justice. 2013;6(1):9–16. https://doi.org/10.1089/env.2012.0017.

  24. Clinical and Translational Science Awards Consortium. Principles of community engagement. 2nd ed. Washington, DC: National Institutes of Health, Centers for Disease Control and Prevention, Agency for Toxic Substances and Disease Registry; 2011.

  25. Data Science to Patient Value Initiative, University of Colorado Anschutz Medical Campus. Stakeholder engagement selection tool. In: Dissemination, implementation, communication and engagement: a guide for health researchers [cited 2021 Jul 1]. Available from: https://dicemethods.org/Tool

  26. Barrera M, Berkel C, Castro F. Directions for the advancement of culturally adapted preventive interventions: local adaptations, engagement, and sustainability. Prev Sci. 2017;18(6):640–8. https://doi.org/10.1007/s11121-016-0705-9.

  27. Baumann A, Cabassa L. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20:190.

  28. Baumann A, Cabassa L, Stirman S. Adaptation in dissemination and implementation science. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 285–300.

  29. Javadi D, Feldhaus I, Mancuso A, Ghaffar A. Applying systems thinking to task shifting for mental health using lay providers: a review of the evidence. Glob Ment Health (Camb). 2017;4:e14.

  30. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement. BMJ. 2013;346:f1049. Available from: https://www.bmj.com/content/346/bmj.f1049

  31. Dopp A, Mundey P, Beasley L, Silovsky J, Eisenberg D. Mixed-method approaches to strengthen economic evaluations in implementation research. Implement Sci. 2019;14(2). Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-018-0850-6

  32. Tucker A, Singer S, Hayes J, Falwell A. Front-line staff perspectives on opportunities for improving the safety and efficiency of hospital work systems. Health Serv Res. 2008;43(5 Pt 2):1807–29. https://doi.org/10.1111/j.1475-6773.2008.00868.x.

  33. Fehlberg E, Lucero R, Weaver M, McDaniel A, Chandler AM, Richey P, et al. Impact of the CMS no-pay policy on hospital-acquired fall prevention related practice patterns. Innov Aging. 2017;1(3).

  34. Nuckols T, Needleman J, Grogan T, Liang L, Worobel-Luk P, Anderson L, et al. Clinical effectiveness and cost of a hospital-based fall prevention intervention: the importance of time nurses spend on the front line of implementation. J Nurs Adm. 2017;47(11):571–80. https://doi.org/10.1097/NNA.0000000000000545.

  35. Johnson K, Gustafson D, Ewigman B, Provost L, Roper R. Using rapid-cycle research to reach goals: awareness, assessment, adaptation, acceleration. Rockville: Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services; 2015. Report No.: AHRQ Publication No. 15-0036.

  36. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1). Available from: https://implementationscience.biomedcentral.com/articles/10.1186/s13012-020-00993-1

  37. Gold M. Cost-effectiveness in health and medicine. New York: Oxford University Press; 1996. p. 425.

  38. Zimmerman L, Lounsbury D, Rosen C, Kimerling R, Trafton J, Lindley S. Participatory system dynamics modeling: increasing stakeholder engagement and precision to improve implementation planning in systems. Adm Policy Ment Health Ment Health Serv Res. 2016;43.

  39. Beidas R, Maclean J, Fishman J, Dorsey S, Schoenwald S, Mandell D, et al. A randomized trial to identify accurate and cost-effective fidelity measurement methods for cognitive-behavioral therapy: project FACTS study protocol. BMC Psychiatry. 2016;16:1–10.

  40. Rabin B, Brownson R. Terminology for dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018. p. 19–45.

  41. Rogers E. Diffusion of innovations. New York: Free Press; 2003.

  42. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health Ment Health Serv Res. 2011;38(2):65–76.

  43. Shields G, Elvidge J. Challenges in synthesising cost-effectiveness estimates. Syst Rev. 2020;9. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7727163/

  44. Neumann P. Costing and perspective in published cost-effectiveness analysis. Med Care. 2009;47(7 Suppl 1):S28–32.

  45. Rabin BA, Brownson RC, Kerner JF, Glasgow RE. Methodologic challenges in disseminating evidence-based interventions to promote physical activity. Am J Prev Med. 2006;31(4 Suppl):24–34.

  46. Proctor E, Powell B, McMillen J. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8(1):139.

  47. Curran G, Bauer M, Mittman B, Pyne J, Stetler C. Effectiveness-implementation hybrid designs. Med Care. 2012;50(3):217–26.

  48. Crosswaite C, Curtice L. Disseminating research results: the challenge of bridging the gap between health research and health action. Health Promot Int. 1994;9(4):289–96.

  49. Teddlie C, Tashakkori A. Foundations of mixed methods research: integrating quantitative and qualitative approaches in the social and behavioral sciences. Thousand Oaks: Sage Publications; 2009.

  50. Muennig P, Bounthavong M. Cost-effectiveness analyses in health: a practical approach. 3rd ed. San Francisco: Jossey-Bass; 2016.

  51. Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13(1):87–108.

  52. Johnson K, Hays C, Center H, Daley C. Building capacity and sustainable prevention innovations: a sustainability planning model. Eval Program Plann. 2004;27(2):135–49.

Acknowledgements

We would like to acknowledge the Economics and Implementation Science Workgroup for their formative input and internal review process.

Trial registration

Not applicable.

Funding

Preparation of this manuscript was supported in part by K01DA044279, PI: Eisman; Center grant P50CA244688 (Glasgow).

Author information

Affiliations

Authors

Contributions

The Implementation Science Workgroup leadership conceived the commentary subject matter (Gila Neta, Todd Wagner, and Heather Gold); AE, RG, and AQ led the formative stages around substantive content; all authors (AE, RG, AQ, LP, MB) participated in drafting the manuscript, edited, read and approved the submitted manuscript.

Corresponding author

Correspondence to Andria B. Eisman.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Glossary

Adaptation

Adaptation is defined as “the degree to which an evidence-based intervention is changed or modified by a user during adoption and implementation to suit the needs of the setting or to improve the fit to local conditions” [40, 41].

Adoption

Adoption is the initial decision or intention of an organization or a community to employ or try an evidence-based intervention [42].

Analytic perspective

Identifies which costs (inputs) are included in an economic evaluation [43]; the societal perspective includes all costs and benefits, regardless of who pays the costs or receives the benefits [44].

Decision science

Decision science is the collection of quantitative techniques used to inform decision-making at the individual and population levels. It includes decision analysis, risk analysis, cost-benefit, and cost-effectiveness analysis, constrained optimization, simulation modeling, and behavioral decision theory, as well as parts of operations research, microeconomics, statistical inference, management control, cognitive and social psychology, and computer science. By focusing on decisions as the unit of analysis, decision science provides a unique framework for understanding public health problems, and for improving policies to address those problems [21].

Economic evaluation

The comparative analysis of alternative courses of action in terms of both their costs and consequences [3].

Evidence-based intervention

The focus of D&I activities is on interventions with demonstrated efficacy and effectiveness (i.e., evidence-based). Interventions can broadly include programs, practices, policies, and guidelines [45].

Implementation strategies

Implementation strategies refer to methods, techniques, activities, and resources to enhance the adoption, integration, and sustainment of evidence-based interventions into clinical or community settings [46, 47].

Implementation costs

Implementation cost is defined as the “cost impact of an implementation effort.” Implementation costs depend on intervention costs, the implementation strategy deployed, and the setting(s) in which the intervention is implemented [42].

Innovation

Innovation refers to a practice, idea, or program that is perceived as new by potential adopters. Some use this term interchangeably with evidence-based intervention [48].

Mixed methods

Mixed methods designs involve the collection and analysis of quantitative and qualitative data in a single study to answer research questions, using convergent, sequential, or conversion approaches. Mixed methods designs are appropriate for answering complex research questions [49].

Opportunity costs

The value of the next best investment forgone [50].

Sustainability

Sustainability is the extent to which an evidence-based intervention delivers its intended effects over a period of time after external support from the outside agency or funder is finished; three indicators of sustainability include (1) maintenance of program benefits, (2) institutionalization of the program, and (3) capacity building in the recipient organization or community [51].

Sustainment

Activities that build resources and capacity for continued use of an intervention or program within a clinic or community [52].

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Cite this article

Eisman, A.B., Quanbeck, A., Bounthavong, M. et al. Implementation science issues in understanding, collecting, and using cost estimates: a multi-stakeholder perspective. Implementation Sci 16, 75 (2021). https://doi.org/10.1186/s13012-021-01143-x

Keywords

  • Implementation
  • Costs
  • Stakeholder
  • Perspective
  • Decision-making
  • Context
  • Dissemination
  • Health economics
  • Sustainment