
Mixed-method approaches to strengthen economic evaluations in implementation research

Abstract

Background

Guidance from economic evaluations on which implementation strategies represent the best return on investment will be critical to advancing the Triple Aim of health care: improving patient care and population health while minimizing per-capita cost. The results of traditional (quantitative) economic evaluations are limited by a "qualitative residual" of contextual information and stakeholders' perspectives that cannot be captured by monetary values alone; this residual is particularly prevalent in implementation science research. The emergence of qualitative methods for economic evaluation offers a promising solution.

Main body

To maximize the contributions of economic evaluations to implementation science, we recommend that researchers embrace a mixed-methods research agenda that merges traditional quantitative approaches with innovative, contextually grounded qualitative methods. Such studies are exceedingly rare at present. To assist implementation scientists in making use of mixed methods in this research context, we present an adapted taxonomy of mixed-method studies relevant to economic evaluation. We then illustrate the application of mixed methods in a recently completed cost-effectiveness evaluation, making use of an adapted version of reporting standards for economic evaluations.

Conclusions

By incorporating qualitative methods, implementation researchers can enrich their economic evaluations with detailed, context-specific information that tells the full story of the costs and impacts of implementation. We end by providing suggestions for building a research agenda in mixed-method economic evaluation, along with more resources and training to support investigators who wish to answer our call to action.


Background

To advance the three components of the Triple Aim in health care [1], there is a critical need for research that can identify the most impactful, efficient distribution of resources within health care organizations and systems. To date, implementation scientists have primarily focused on strategies to improve two components of the Triple Aim: (1) patient care (through uptake and adoption of evidence-based practices) and (2) population health (through scaling and sustainment of those practices), with scant attention to the third component: (3) minimizing per-capita cost [2, 3]. This results in a limited perspective because implementation efforts generally result in additional costs to agencies, which are often a critical barrier to implementation and sustainment of evidence-based practices [4, 5]. Guidance from economic evaluations on which implementation strategies represent the best return on investment will be critical to advancing the field. Herein, we argue that incorporation of mixed (i.e., qualitative and quantitative) methods in these evaluations will be necessary to maximize their contribution to implementation science.

Economic evaluation of implementation

Traditional methods of economic evaluation in health care (see [6,7,8]) compare incremental differences in costs and outcomes—i.e., health-related efficacy or effectiveness data—among discrete clinical practices (e.g., two interventions in a clinical trial). Specific methods include cost analysis, which compares costs only; cost-effectiveness analysis, which compares costs to changes in a quantitative measure of health-related outcomes (or standardized outcomes, such as quality-adjusted or disability-adjusted life years, in the case of cost-utility analysis); benefit-cost analysis, which compares costs to monetized benefits of health-related outcomes (i.e., dollars to dollars); and budget impact analysis, which examines the consequences of an intervention on the budget of the agency that delivers it. These methods are all highly technical and quantitative, and have been applied most often to data from randomized clinical trials.
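Concretely, most of these methods reduce to an incremental comparison of alternatives. In cost-effectiveness analysis, for example, the standard summary statistic is the incremental cost-effectiveness ratio (ICER) of alternatives A and B:

$$ \mathrm{ICER}=\frac{\mathrm{Cost}_A-\mathrm{Cost}_B}{\mathrm{Effect}_A-\mathrm{Effect}_B} $$

For instance, if alternative A costs $5000 more than B and yields 0.25 additional quality-adjusted life years, the ICER is $20,000 per QALY gained.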

These traditional methods are certainly informative to implementation efforts (e.g., allow for consideration of an intervention’s economic effects when deciding whether to implement it). However, the methods become more complex and challenging when extended to implementation research. Such extensions involve comparing the costs of different implementation strategies to the outcomes (e.g., fidelity to or acceptability of the evidence-based practice; clinical symptoms) resulting from those strategies. As shown in the equation below, which compares the incremental costs and outcomes of two different implementation strategies, both the intervention and implementation strategy chosen must be considered when evaluating economic impact:

$$ \left(\mathrm{Cost}_{\mathrm{Intervention}}+\mathrm{Cost}_{\mathrm{Implementation\ Strategy\ A}}\right)-\left(\mathrm{Cost}_{\mathrm{Intervention}}+\mathrm{Cost}_{\mathrm{Implementation\ Strategy\ B}}\right) $$

vs.

$$ \mathrm{Outcome}_{\mathrm{Intervention\ w/\ Implementation\ Strategy\ A}}-\mathrm{Outcome}_{\mathrm{Intervention\ w/\ Implementation\ Strategy\ B}} $$
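To make this comparison concrete, consider a hypothetical illustration (all figures are invented): suppose both strategies deliver the same intervention costing $50,000; strategy A adds $30,000 in implementation costs and achieves fidelity of 0.90, while strategy B adds $10,000 and achieves 0.70. Dividing incremental cost by incremental outcome gives:

$$ \frac{(\$50{,}000+\$30{,}000)-(\$50{,}000+\$10{,}000)}{0.90-0.70}=\frac{\$20{,}000}{0.20}=\$100{,}000\ \text{per unit of fidelity gained} $$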

An additional challenge in economic evaluation is the "qualitative residual" [9] that often attaches to the results of quantitative economic evaluations, owing to their limited ability to capture the contexts and stakeholders' perspectives within which monetary values must be interpreted. This limitation is especially salient in implementation research because both the outcomes [10] and the costs [11, 12] depend on the context in which implementation takes place. Thus, in the equation above, each component carries its own qualitative residual. For example, the decisions made by clinic staff during implementation can influence intervention costs (e.g., by adding or dropping components), implementation costs (e.g., by allocating personnel and resources to implementation activities), and outcomes (e.g., poorly functioning clinics may need to expend more resources to achieve a desired level of implementation quality).

Qualitative methods for economic evaluation

Recognizing limitations to exclusive use of quantitative economic evaluations, an increasing number of scholars outside of implementation science have advocated for the incorporation of qualitative data into those evaluations [13,14,15]. The traditions of qualitative research methods are as rich and varied as those of quantitative methods (see, e.g., [16, 17]). Techniques for data collection include individual interviews and focus groups designed to gather participant perspectives on a topic; site visits to observe where participants live, work, or play; review of records and other documents to glean insights about activities; and ethnographic field work in which the researcher is embedded within a community while collecting detailed observations. Analytic techniques apply a variety of perspectives to analyzing the words, themes, and language conventions that make up qualitative data, such as content analysis, thematic analysis, grounded theory, and in-depth case studies. The common thread through all qualitative methods is an emphasis on achieving a depth of understanding (often with a small sample of participants or groups) that captures the perspectives, experiences, or environments of certain individuals or groups.

Formal qualitative research occupies a small but growing place within the field of economics [18, 19]. Certainly, carefully done economic evaluations often contain informal qualitative components, even if those components are not identified as such. For example, the evaluators might develop data collection instruments based on careful discussions with practitioners or base their interpretations of the data on their familiarity with the organizations and settings involved. However, it is rare for these methods to be incorporated thoroughly into the analytical plan and process or to be formally documented. Moreover, few of the qualitative economic studies to date have focused on the economics of health care, let alone implementation specifically. Given that qualitative methods are ideally suited to provide the "thick description" [20] of contextual information needed for high-quality studies of implementation, rigorous application of qualitative methods provides rich information critical for economic evaluation that is unattainable by traditional, quantitative methods in isolation. Therefore, as we describe next, a mixed-method approach is better suited for implementation research. More broadly, such an approach is also compatible with calls for an ethical imperative to include participants' voices, using qualitative methods, in theoretical and empirical representations of the economic forces that shape their lives (including in health care) [19, 21].

A call for mixed-method economic evaluation

Despite their numerous strengths, we do not suggest that qualitative methods of economic evaluation should replace quantitative methods—which are well-established, rigorous, and have a long and impactful history of use. Instead, we recommend that implementation scientists begin to develop a research agenda around mixed-method economic evaluation. Mixed methods refer to a tradition that combines qualitative and quantitative data to address the same (or closely related) research questions [22, 23]. Combining the complementary strengths and perspectives of each research tradition allows for a better understanding of a research topic than either approach in isolation [23, 24] and provides an opportunity to derive emergent insights by merging multiple perspectives [25]. For these reasons, mixed methods are an essential component of “gold standard” studies in implementation science [10, 26].

Unfortunately, to date, virtually no research has combined quantitative and qualitative approaches in the economic evaluation of health services. We located only 165 results in a PubMed search on August 31, 2018, for the following terms: “((“economic eval*” OR “economic analysis” OR “cost-effect*” OR “cost-benefit” OR “cost-utility” OR “cost effect*” OR “cost benefit” OR “cost utility”) AND (“mixed method*” OR “mixed-method*”)) AND health.” A search with the same terms in EconLit returned only four results. Of the subset of these results that actually described an economic evaluation, most reported on a purely quantitative economic evaluation in the context of a larger mixed-method study (i.e., qualitative data were collected but were not used to answer questions about economic costs and impacts). For examples, see Heller et al.’s [27] evaluation of a management program for type 1 diabetes and Rise et al.’s [28] protocol for a randomized trial evaluating a modification to an occupational rehabilitation program. We located only one study, a benefit-cost analysis of the Australian acute care accreditation program [29], that explicitly integrated qualitative (focus groups, expert panels) and quantitative (cost information collected via surveys and semi-structured interviews, indicators of patient safety and quality of care extracted from administrative datasets) methods to identify, quantify, and validate the costs and benefits of accreditation.
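For readers who wish to rerun this search, a minimal sketch using Biopython's Entrez interface follows (an assumption on our part: Biopython is installed and a contact email is supplied; current counts will differ from the August 31, 2018 results):

```python
# Reproduce the PubMed search via NCBI E-utilities (Biopython).
from Bio import Entrez

Entrez.email = "researcher@example.org"  # NCBI requires a contact address

query = (
    '(("economic eval*" OR "economic analysis" OR "cost-effect*" OR '
    '"cost-benefit" OR "cost-utility" OR "cost effect*" OR "cost benefit" '
    'OR "cost utility") AND ("mixed method*" OR "mixed-method*")) AND health'
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=0)
record = Entrez.read(handle)
handle.close()
print(f"PubMed records matching the search: {record['Count']}")
```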

Given a limited primary empirical literature from which to draw, our recommendations for mixed-method economic evaluations instead come from conceptual and methodological literature related to these topics, as well as our own experience conducting qualitative and mixed-method research. That experience includes an economic evaluation, described in more detail later, which is currently undergoing peer review.

Taxonomy of mixed-method economic evaluations

Palinkas and colleagues [10] previously developed a useful taxonomy that describes the arrangements of qualitative (“qual”) and quantitative (“quant”) methods within mixed-method implementation research studies. The major features of that taxonomy include the structure (e.g., sequential vs. simultaneous data collection and analysis; primary emphasis on qual methods, quant methods, or both equally), function (i.e., what is achieved by combining qual and quant data), and process (i.e., how they are combined) of mixed methods within the study. Table 1 presents an adapted version of that taxonomy that is specific to mixed-method economic evaluations. Our intent in creating this taxonomy was to aid implementation researchers in (a) conceptualizing an agenda of mixed-methods economic evaluation that spans the full breadth of potential mixed methods, and (b) selecting the appropriate study design when planning a given mixed-method economic evaluation.

Table 1 Taxonomy of mixed method designs for economic evaluation

Our adapted taxonomy differs from the original in three key ways. First, because economic evaluation is ultimately focused on quant questions (e.g., dollar amounts) and hypothesis testing, a mixed-method economic evaluation is best described as a "pure" (i.e., equal emphasis on qual and quant) or "quant-dominant mixed" study [23]. Therefore, we excluded structural categories described by Palinkas et al. [10] in which qual methods were dominant. Second, we modified the definitions from the original taxonomy to include language and examples specific to economic evaluation. Note that in this taxonomy, qual data refer to information about the types of costs and impacts for an implementation activity, the contextual factors that influenced those costs and impacts, and their relative importance or priority. In contrast, quant data include any numeric information on implementation-related costs and impacts, such as monetary amounts, utilization frequency counts, or scores on a quantitative measure of symptoms. Finally, for simplicity of presentation, we combined the function and process dimensions of the taxonomy because they are closely aligned (i.e., the original taxonomy [10] described which functions are achieved through each process in addition to separately defining the functions).
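As a planning aid, the taxonomy's dimensions can be sketched as a simple data structure. The encoding below is our own hypothetical illustration, not part of the published taxonomy; the labels paraphrase Table 1 and the enumerations are not exhaustive:

```python
# Hypothetical encoding of the adapted taxonomy's dimensions.
from dataclasses import dataclass
from enum import Enum

class Structure(Enum):
    QUAL_THEN_QUANT = "Qual -> QUANT (sequential; qual informs quant)"
    QUANT_THEN_QUAL = "QUANT -> Qual (sequential; qual explains quant)"
    QUANT_PLUS_QUAL = "QUANT + Qual (simultaneous)"

class FunctionProcess(Enum):
    DEVELOPMENT_CONNECT = "qual data used to build quant instruments or models"
    CONVERGENCE_MERGE = "qual and quant data used to validate one another"
    SAMPLING_CONNECT = "qual data used to select sites or participants"

@dataclass
class MixedMethodEconEval:
    structure: Structure
    function_process: FunctionProcess

# Example: the cost-survey development step in the study described next.
cost_survey_design = MixedMethodEconEval(
    structure=Structure.QUAL_THEN_QUANT,
    function_process=FunctionProcess.DEVELOPMENT_CONNECT,
)
print(cost_survey_design.function_process.value)
```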

Illustrative example

Recently, a subset of the present authors completed a cost-effectiveness evaluation of the implementation of Problematic Sexual Behavior–Cognitive-Behavioral Therapy (PSB-CBT) at six provider agencies nationwide (Dopp A, Mundey P, Silovsky J, Hunter M, Slemaker A: Economic value of community-based services for problematic sexual behaviors in youth: a mixed-method cost-effectiveness analysis, under review). PSB-CBT is a community-based, group-format treatment that has demonstrated significant effects on problematic sexual behavior in youth ages 7 to 14 (see [30]). This study provides a unique opportunity to illustrate, in detail, the mechanics of a mixed-method economic evaluation in an implementation study. Table 2 describes the study using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) [31], and includes information about how mixed methods informed each item of the evaluation. We omitted some items because they were not relevant to our study or were non-methodological, and we only briefly mentioned items that did not incorporate qual data in our study. For example, our six participating agencies (“Setting and Location”) were selected based on funder decisions (whereas a mixed-method study with a sampling function might use qualitative data to select agencies). Moreover, it is beyond the scope of this article to provide guidance on all technical aspects of economic evaluation (e.g., perspective, discount rate), and many excellent resources already exist for quantitative economic evaluations in health care [6,7,8].

Table 2 Illustrative example: mixed-method cost-effectiveness evaluation of problematic sexual behavior-cognitive behavioral therapy

Our evaluation benefitted from the use of mixed methods in two key ways. First, we took a Qual ➔ QUANT approach (development function, connect-initiate process; see Table 1) to create a survey of costs incurred during implementation of PSB-CBT. We developed the survey items based on qual data from interviews with 59 therapists, administrators, and external stakeholders from the agencies implementing PSB-CBT, ensuring broad coverage of costs that included staff activities, training expenses, number of youth served, and proportion of activities billed to various sources. Similarly, we planned several sensitivity analyses—which examine the influence of variation in model parameters on the findings of an economic evaluation [32]—using qual data about agency-specific contextual factors that affected implementation. For example, interviewees at an agency that regularly provided intensive individual services to youth in the PSB-CBT program noted higher costs, so we examined the impact of providing such services on cost-effectiveness and found that PSB-CBT was no longer cost-effective under those conditions. Of course, use of qualitative data in our evaluation design also introduced new challenges. In particular, interviewees described in detail how avoided expenses from alternatives to PSB-CBT (e.g., residential treatment, juvenile detention) were a key benefit of the program. However, when we asked program administrators in the cost survey to quantify savings from such avoided expenses, they were unable to provide specific monetary values because those expenses would be incurred by other agencies. Potential cost savings that accrue at the community level, rather than to an individual agency, complicate economic evaluation, and findings are greatly restricted when relying on quant data alone.
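To show the mechanics of such a one-way sensitivity analysis, a minimal sketch follows. All cost, effect, and threshold values are hypothetical assumptions for illustration; the study's actual parameters appear in the manuscript under review:

```python
# One-way sensitivity analysis sketch (all values hypothetical).

def cost_effectiveness_ratio(cost: float, effect: float) -> float:
    """Cost per one-unit improvement in youth symptom scores."""
    return cost / effect

BASE_COST_PER_YOUTH = 2_900.0        # hypothetical program cost per youth
INTENSIVE_SERVICES_ADDON = 4_500.0   # hypothetical add-on flagged in interviews
SYMPTOM_IMPROVEMENT = 0.8            # hypothetical mean symptom improvement
THRESHOLD = 8_333.0                  # hypothetical willingness-to-pay threshold

scenarios = {
    "base case": BASE_COST_PER_YOUTH,
    "with intensive individual services":
        BASE_COST_PER_YOUTH + INTENSIVE_SERVICES_ADDON,
}
for label, cost in scenarios.items():
    cer = cost_effectiveness_ratio(cost, SYMPTOM_IMPROVEMENT)
    verdict = "cost-effective" if cer <= THRESHOLD else "not cost-effective"
    print(f"{label}: ${cer:,.0f} per unit improvement ({verdict})")
```

Under these invented numbers, the base case falls below the threshold while the intensive-services scenario exceeds it, mirroring the pattern of results described above.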

Second, we used a QUANT + Qual approach to analyze and interpret our findings regarding the cost-effectiveness of PSB-CBT. Specifically, we used qual themes to validate conclusions from our quantitative cost-effectiveness ratios, representing a convergence function and merge process (see Table 1). This proved critical because, in the absence of quantitative data on the value of PSB-CBT outcomes (vs. alternative placements), we had to derive a quantitative threshold for cost-effectiveness from existing literature (detailed in Dopp A, Mundey P, Silovsky J, Hunter M, Slemaker A: Economic value of community-based services for problematic sexual behaviors in youth: a mixed-method cost-effectiveness analysis, under review). That threshold suggested that costs of up to $8333 per one-unit improvement in youth symptoms were cost-effective, but we needed a way to validate the threshold. We therefore examined themes from the qualitative interviews, in which respondents indicated that PSB-CBT added considerable value to families and society by providing a vital service that kept youth with PSB in the community, enhanced public safety, and was less expensive than traditional services for this population. These findings were consistent with the quantitative results, in which PSB-CBT was cost-effective under almost all conditions, thus allowing us to triangulate the conclusion that PSB-CBT has a valuable impact that is worth the cost of the program.

Conclusions

In this article, we have recommended that implementation scientists embrace a mixed-method research agenda for economic evaluation, provided a taxonomy of mixed-method studies relevant to economic evaluation, and illustrated the application (and reporting) of these methods by presenting a recently completed study. Through incorporation of qualitative methods, implementation researchers can strengthen their economic evaluations with rich, contextually grounded stories that facilitate the interpretation (and actionability) of their results.

Of course, many challenges and unanswered questions remain in this area of research. We hope that other implementation researchers will use the proposed taxonomy and reporting standards to generate a more robust empirical research base. We also encourage those researchers to build on and modify the taxonomy and reporting standards; our example study had some notable limitations (e.g., lack of a comparison group) that may have led to concomitant limitations in the tools that we have developed thus far. Rigorous engagement with the proposed research agenda by many experts—working across a variety of implementation strategies, settings, and target evidence-based practices—will be necessary to reach scientific consensus on best practices in mixed-method economic evaluation. Across these various research efforts, examples of questions that could advance implementation science (while providing opportunities to explore and further refine mixed methods for economic evaluation) include:

  1. What are the full economic costs and consequences of alternative implementation strategies and health services? These types of questions will extend traditional lines of economic evaluation research into the implementation science space. As in evaluation of other implementation outcomes, it will be critical to position findings within the contextual information and stakeholder perspectives provided by qualitative methods. Such information could be particularly valuable for understanding the economics of long-term sustainment of evidence-based practices following initial implementation, given the complex and dynamically changing factors involved.

  2. How do the economic costs and consequences of implementation vary as a function of systemic and contextual factors (e.g., size of the organization, implementation climate)? This could be an excellent opportunity for simulation modeling and systems science approaches to economic evaluation [33], in which qualitative data could richly inform specification of the quantitative models (see the sketch following this list).

  3. What are the major sources of uncertainty when estimating the economic impact of implementation efforts, and how should those sources be accounted for? For instance, in implementation studies, it may be unclear to what extent start-up costs (e.g., training) will recur in the future (e.g., to provide refreshers to current personnel or to train new providers when turnover occurs). Another source of complexity is what perspective the economic evaluation should take (i.e., who incurs the costs and who receives the impacts?), given the numerous stakeholder perspectives often represented in implementation efforts (e.g., when a mental health organization pays to implement an intervention that produces benefits in another sector, such as child welfare or criminal justice). The various scenarios or values to be represented in a sensitivity analysis [32] can be difficult to determine without a firm understanding of the qualitative context.

  4. What is a reasonable return on investment for implementation and service outcomes (e.g., how much should we spend to increase fidelity to PSB-CBT, an implementation outcome, or to decrease average wait time to receive care, a service outcome)? Such thresholds are not currently available in the health economics literature (which has focused on returns on investment for clinical outcomes), presenting challenges for implementation researchers (see, e.g., [34]). Incorporation of qualitative data into the development of such thresholds would likely strengthen their usefulness and credibility.

  5. What are the best ways to address ethical issues introduced by using mixed methods in economic evaluations of implementation efforts? Inclusion of participant perspectives via qualitative methods certainly advances principles of justice and respect for persons [19, 21], but the level of detail captured by qualitative data also results in increased risks to participants [35]. For instance, collection of detailed qualitative information about implementation costs could threaten confidentiality by increasing the likelihood that participants are individually identifiable, as well as increase the potential harms of a breach in confidentiality (e.g., proprietary information could result in financial or legal ramifications in the event of a breach). It will be important to consider what types of training and guidelines will be necessary for researchers with a background in economic evaluation to learn and use established ethical practices for qualitative research (see [16, 17]). We anticipate that relevant topics might include confidentiality (e.g., de-identifying narrative data using pseudonyms and generic language) and data integrity (e.g., ensuring complete [non-selective] reporting of data, reporting quotations in context, determining when enough data have been collected to draw robust conclusions), among others.
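As a flavor of how questions 2 and 3 might be approached, the sketch below runs a toy Monte Carlo simulation in which a qualitative finding (here, a hypothetical interview theme that training costs recur for roughly a third of agencies because of staff turnover) shapes the quantitative uncertainty model. Every value and distribution is an invented assumption, not an estimate from our study:

```python
# Toy Monte Carlo sketch: qualitative context informs the uncertainty model.
import random

random.seed(42)

N_SIMULATIONS = 10_000
THRESHOLD = 8_333.0  # hypothetical willingness-to-pay per unit improvement

def simulate_cer() -> float:
    start_up = random.triangular(5_000, 15_000, 9_000)   # low, high, mode
    # Interviews suggested ~1/3 of agencies re-incur training costs.
    recurring = random.triangular(1_000, 4_000, 2_000) if random.random() < 0.33 else 0.0
    effect = max(random.gauss(mu=1.0, sigma=0.25), 0.1)  # guard the denominator
    return (start_up + recurring) / effect

cers = [simulate_cer() for _ in range(N_SIMULATIONS)]
share = sum(c <= THRESHOLD for c in cers) / N_SIMULATIONS
print(f"Simulated probability of being cost-effective: {share:.1%}")
```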

We close by acknowledging that our proposed agenda will require researchers to continue pushing the boundaries of the interdisciplinary team science approaches that are already common—yet remain challenging—in health services research [10, 26, 36]. Mixed-method economic evaluations will require health services researchers to develop an understanding of economic evaluation [2, 3] and economists to develop an understanding of qualitative and mixed methods [18, 19]. Thus, we hope to see an increase in resources (e.g., toolkits, formal coursework, mentored research programs) that support the development of researchers who combine qualitative and quantitative perspectives in economic evaluations. Such training would add to the growing array of implementation science resource initiatives [37]—paving the way for more innovative, contextually valid, and impactful studies to advance all aspects of the Triple Aim in health care [1].

Abbreviations

CER: Cost-effectiveness ratio
CHEERS: Consolidated Health Economic Evaluation Reporting Standards
OJJDP: Office of Juvenile Justice and Delinquency Prevention
PSB-CBT: Problematic Sexual Behavior–Cognitive-Behavioral Therapy
QUAL or Qual: Qualitative
QUANT or Quant: Quantitative

References

  1. Berwick D, Nolan T, Whittington J. The triple aim: care, health, and cost. Health Aff. 2008; https://doi.org/10.1377/hlthaff.27.3.759.

  2. Hoomans T, Severens JL. Economic evaluation of implementation strategies in health care. Implement Sci. 2014; https://doi.org/10.1186/s13012-014-0168-y.

  3. Raghavan R. The role of economic evaluation in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018. p. 89–106.


  4. Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, Williams J. Long-term sustainability of evidence-based practices in community mental health agencies. Adm Policy Ment Hlth. 2014; https://doi.org/10.1007/s10488-012-0461-5.

  5. Roundfield KD, Lang JM. Costs to community mental health agencies to sustain an evidence-based practice. Psychiatr Serv. 2017; https://doi.org/10.1176/appi.ps.201600193.

  6. Drummond MF, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the economic evaluation of health care programmes. 4th ed. Oxford: Oxford University Press; 2015.


  7. Gold MR, Siegel JE, Russell LB, Weinstein MC. Cost-effectiveness in health and medicine. New York: Oxford University Press; 1996.


  8. Yates BT. Cost-inclusive evaluation: a banquet of approaches for including costs, benefits, and cost-effectiveness and cost-benefit analysis in your next evaluation. Eval Program Plann. 2009; https://doi.org/10.1016/j.evalprogplan.2008.08.007.

  9. Onwuegbuzie AJ, Teddlie C. A framework for analyzing data in mixed methods research. In: Tashakkori AJ, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. London: Sage Publications; 2003. p. 351–83.


  10. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Hlth. 2011; https://doi.org/10.1007/s10488-010-0314-z.

  11. Saldana L, Chamberlain P, Bradford WD, Campbell M, Landsverk J. The cost of implementing new strategies (COINS): a method for mapping implementation resources using the stages of implementation completion. Child Youth Serv Rev. 2014; https://doi.org/10.1016/j.childyouth.2013.10.006.

  12. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: The RE-AIM framework. Am J Public Health. 1999; https://doi.org/10.2105/AJPH.89.9.1322.

  13. Johnson KL, Yorkston KM, Klasner ER, Kuehn CM, Johnson E, Amtmann D. The cost and benefits of employment: a qualitative study of experiences of persons with multiple sclerosis. Arch Phys Med Rehab. 2004; https://doi.org/10.1016/S0003-9993(03)00614-2.

  14. Rogers PJ, Stevens K, Boymal J. Qualitative cost-benefit evaluation of complex, emergent programs. Eval Program Plann. 2009; https://doi.org/10.1016/j.evalprogplan.2008.08.005.

  15. Ziller A, Phibbs P. Integrating social impacts into cost-benefit analysis: a participative method: case study: the NSW area assistance scheme. Impact Assess Proj Apprais. 2012; https://doi.org/10.3152/14715460378176636.

  16. Bazeley P. Qualitative data analysis: practical strategies. Thousand Oaks, CA: Sage; 2013.


  17. Patton MQ. Qualitative research & evaluation methods. 3rd ed. Thousand Oaks, CA: Sage; 2002.


  18. Jefferson T, Austen S, Sharp R, Ong R, Lewin G, Adams V. Mixed-methods research: what’s in it for economists? Econ Labour Relat Re. 2014; https://doi.org/10.1177/1035304614530819.

  19. Starr MA. Qualitative and mixed-methods research in economics: surprising growth, promising future. J Econ Surv. 2014; https://doi.org/10.1111/joes.12004.

  20. Geertz C. The interpretation of cultures. New York: Basic Books; 2000.


  21. Ruccio D. Economic representations: academic and everyday. London: Routledge; 2008.


  22. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Thousand Oaks, CA: Sage; 2011.


  23. Johnson RB, Onwuegbuzie AJ, Turner LA. Toward a definition of mixed methods research. J Mix Method Res. 2007; https://doi.org/10.1177/1558689806298224.

  24. Shneerson CL, Gale NK. Using mixed methods to identify and answer clinically relevant research questions. Qual Health Res. 2015; https://doi.org/10.1177/1049732315580107.

  25. Wenger-Trayner B, Wenger-Trayner E, Cameron J, Eryigit-Madzwamuse S, Hart A. Boundaries and boundary objects: an evaluation framework for mixed methods research. J Mix Method Res. 2017; https://doi.org/10.1177/1558689817732225.

  26. Green CA, Duan N, Gibbons RD, Hoagwood KE, Palinkas LA, Wisdom JP. Approaches to mixed methods dissemination and implementation research: methods, strengths, caveats, and opportunities. Adm Policy Ment Hlth. 2015; https://doi.org/10.1007/s10488-014-0552-6.

  27. Heller S, Lawton J, Amiel S, Cooke D, Mansell P, Brennan A, et al. Improving management of type 1 diabetes in the UK: the dose adjustment for normal eating (DAFNE) programme as a research test-bed. In: Programme Grants for Applied Research; 2014. https://doi.org/10.3310/pgfar02050.


  28. Rise MB, Skagseth M, Klevanger NE, Aasdahl L, Borchgrevink P, Jensen C, et al. Design of a study evaluating the effects, health economics, and stakeholder perspectives of a multi-component occupational rehabilitation program with an added workplace intervention: a study protocol. BMC Public Health. 2018; https://doi.org/10.1186/s12889-018-5130-5.

  29. Mumford V, Greenfield D, Hinchcliff R, Moldovan M, Forde K, Westbrook JI, et al. Economic evaluation of Australian acute care accreditation (ACCREDIT-CBA (acute)): study protocol for a mixed method research project. BMJ Open. 2013; https://doi.org/10.1136/bmjopen-2012-002381.

  30. Silovsky JF, Hunter M, Taylor EK. Early intervention for youth with problematic sexual behavior. J Sex Aggress. 2018; https://doi.org/10.1080/13552600.2018.1507487.

  31. Husereau D, Drummond M, Petrou S, Carswell C, Moher D, Greenberg D, et al. Consolidated health economic evaluation reporting standards (CHEERS) statement. BMJ. 2013; https://doi.org/10.1136/bmj.f1049.

  32. Briggs AH, Gray AM. Handling uncertainty in economic evaluations of healthcare interventions. Brit Med J. 1999; https://doi.org/10.1136/bmj.319.7210.635.

  33. Luke DA, Morshed AB, McKay VR, Combs TB. Systems science methods in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2018. p. 157–74.


  34. Dopp AR, Hanson RF, Saunders BE, Dismuke CE, Moreland AD. Community-based implementation of trauma-focused interventions for youth: economic impact of the learning collaborative model. Psychol Serv. 2017; https://doi.org/10.1037/ser0000131.

  35. Corti L, Day A, Backhouse G. Confidentiality and informed consent: issues for consideration in the preservation of and provision of access to qualitative data archives. Forum Qual Soc Res. 2000; https://doi.org/10.17169/fqs-1.3.1024.

  36. O’Cathain A, Murphy E, Nicholl J. Multidisciplinary, interdisciplinary, or dysfunctional? Team working in mixed-methods research. Qual Health Res. 2008; https://doi.org/10.1177/1049732308325535.

  37. Darnell D, Dorsey CN, Melvin A, Chi J, Lyon AR, Lewis CC. A content analysis of dissemination and implementation science resource initiatives: what types of resources do they offer to advance the field? Implement Sci. 2017; https://doi.org/10.1186/s13012-017-0673-x.


Acknowledgements

The authors wish to thank the individuals and agencies involved in our collaborative partnership among professionals in university-based research, training, and technical assistance teams, federal and local agencies, expert consultants, and community-based service agencies. They are too numerous to mention by name, but we note their dedication to efforts to systematically develop, implement, evaluate, and expand effective intervention programs for families impacted by youth with problematic sexual behaviors.

Funding

The example study described in this paper was supported by Grants 2010-WP-BX-K062, 2013-MU-MU-K102, and 2016-MU-MU-K053 (PI: Silovsky) awarded by the Office of Juvenile Justice and Delinquency Prevention (OJJDP), Office of Justice Programs, U.S. Department of Justice. The opinions, findings, and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect those of the Department of Justice. The funding body played no role in the design of the research described in this manuscript or in the writing of the manuscript.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on request.

Author information


Contributions

AD conceptualized this manuscript, wrote the first draft, and adapted the study taxonomy and reporting standards described within the manuscript. PM contributed to AD’s conceptualization and initial drafting of the manuscript. PM, LB, JS, and DE reviewed drafts of the manuscript and contributed additional conceptualization and writing to the final product. All authors reviewed and approved the submitted version of the manuscript.

Corresponding author

Correspondence to Alex R. Dopp.

Ethics declarations

Ethics approval and consent to participate

Data reported in this manuscript were originally collected for the purposes of program evaluation and quality improvement. The need for informed consent of participants was waived because agencies participating in the evaluation provided deidentified data. All procedures and measures for the evaluation were approved by the participating agencies and the Institutional Review Board of the University of Oklahoma Health Sciences Center (IRB #4041).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


Cite this article

Dopp, A.R., Mundey, P., Beasley, L.O. et al. Mixed-method approaches to strengthen economic evaluations in implementation research. Implementation Sci 14, 2 (2019). https://doi.org/10.1186/s13012-018-0850-6
