How can a behavioral economics lens contribute to implementation science?

Abstract

Background

Implementation science in health is an interdisciplinary field with an emphasis on supporting behavior change required when clinicians and other actors implement evidence-based practices within organizational constraints. Behavioral economics has emerged in parallel and works towards developing realistic models of how humans behave and categorizes a wide range of features of choices that can influence behavior. We argue that implementation science can be enhanced by the incorporation of approaches from behavioral economics.

Main body

First, we provide a general overview of implementation science and ways in which implementation science has been limited to date. Second, we review principles of behavioral economics and describe how its concepts have been successfully applied to healthcare, including nudges deployed in the electronic health record. For example, de-implementation of low-value prescribing has been supported by changing the default in the electronic health record. We then describe what a behavioral economics lens offers to existing implementation science theories, models, and frameworks, including rich and realistic models of human behavior, additional research methods such as pre-mortems and behavioral design, and low-cost and scalable implementation strategies. We argue that insights from behavioral economics can guide the design of implementation strategies and the interpretation of implementation studies. Key objections to incorporating behavioral economics are addressed, including concerns about sustainment and the level at which the strategies work.

Conclusion

Scholars should consider augmenting implementation science theories, models, and frameworks with relevant insights from behavioral economics. By drawing on these additional insights, implementation scientists have the potential to boost efforts to expand the provision and availability of high quality care.


Introduction

Human behavior is the last mile challenge to many seemingly intractable problems in improving the human condition. Many scientific discoveries are unevenly accessed and delivered due to an underappreciation for how social and behavioral factors might interface with the implementation of these discoveries. For example, during the COVID-19 pandemic, mRNA vaccines were developed and available within one year and yet too little consideration was paid to implementation and human behavior, resulting in uneven implementation and stark inequities [1]. Similarly, as it becomes clear that some prescribing practices are low-value, the de-implementation process requires changing clinician behavior [2, 3]. As new screening modalities emerge that can prevent the onset of disease, it is essential that clinicians refer those patients most likely to benefit [4].

Implementation science has emerged as a convergence field, bringing together multiple disciplines to close the gap between what we know and what we do – or in other words, focused on behavior change of healthcare actors within organizational constraints, including clinicians, managers, funders, and health service users [5]. Implementation science has made great advances in coalescing as a field over the past two decades, drawing on a range of disciplines including organizational theory, human factors, improvement science, and adult learning theory [6]. Early work in implementation science characterized barriers and facilitators to implementation when initial efforts to change behavior within organizational constraints were often unsuccessful. The next generation of studies explored cross-sectional relationships between determinants hypothesized in leading conceptual frameworks. The most recent work in the field tests the comparative effectiveness of implementation strategies [7]. However, limitations of the current paradigm include an overreliance on education-focused implementation strategies such as training [8], approaches that are designed for the ideals of human behavior and do not take into account knowable and predictable patterns of human decision making, and the use of costly resource-intensive strategies that are difficult to scale.

The field of behavioral economics has developed in parallel and has also focused empirical inquiry on understanding human behaviors within various settings, including the healthcare environment. Behavioral economics offers a paradigm shift in how social scientists, including psychologists and economists, understand human behavior and decision-making [9]. Through 50 years of research, novel findings in psychology and economics have allowed behavioral economists to identify and categorize factors that drive human behavior in surprising but replicable ways, disrupting existing scholarly consensus about how people make decisions and introducing a new set of frameworks for researchers and policymakers. Importantly, behavioral economics offers simple and low-cost approaches that build on evidence of how humans make decisions. Specific concepts from behavioral economics that have been applied to change health behavior include promoting vaccination [10], de-prescribing [11], and screening [12]. For example, informing members of the public that a personal COVID-19 vaccine was ready specifically for them was effective at increasing vaccination uptake [10]. Default settings in electronic health records have changed prescribing behavior [2]. And recent scholarship has highlighted the scope for improving cancer screening pathways by removing unnecessary friction and paperwork, sometimes described as ‘sludge’ [4, 13].

While focused on similar outcomes, the two fields have not been explicitly woven together, thus offering an opportunity for synergizing and maximizing impact. Systematically incorporating the behavioral economics perspective into implementation science is an important opportunity to advance the field. We propose that approaches from behavioral economics can allow us to understand behaviors in a more complete and nuanced manner (see Footnote 1). More specifically, we argue that insights from behavioral economics can guide the design of implementation strategies and the interpretation of implementation studies for the advancement of the field.

Behavioral economics explains behavior in a more realistic and practical way

Classical economic theories of human behavior assume humans maximize utility – in other words, appraising all potential actions and selecting the one perceived to be the most beneficial – but evidence from behavioral economics has increasingly revealed that humans do not maximize utility when making decisions [9, 14]. That is to say, given human decision-making processes, all people working in healthcare are “boundedly rational” in predictable ways [15]. Dual Process Theory is one important way of understanding this phenomenon, holding that people make decisions using two systems. System 1 is fast, intuitive, and automatic, and prioritizes efficiency. It relies on heuristics (mental shortcuts) and biases (commonly repeated patterns of response) [9, 16]. System 1 runs without us noticing but is prone to errors because its heuristics and biases are generally useful but not tailored to every situation. System 2 is slow, conscious, and effortful. It analyzes problems logically to avoid pitfalls, but tires quickly [9].

The predictable thinking patterns or biases which result in bounded rationality are increasingly well described and replicated. They impact almost every part of life, including many areas of healthcare. Present bias, for example, means that people prefer immediate pleasure to delayed pleasure; we tend to accept more pain later rather than a little pain now [17]. This might mean patients avoid vaccinations now despite risking hospitalization later. It could mean clinicians delay learning about a technology which will improve the efficiency of their practice because they anticipate that the initial process of learning will be tiring and stressful. Due to commission bias, people tend to choose to act rather than not act [18]. This could lead doctors to recommend cancer screening for patients at low risk of cancer because it feels like taking action, even though it produces cancer scares, unnecessary biopsies, and avoidable pain or loss of function: harmful, low-value care. Under the availability heuristic, people consider outcomes more likely if they can readily bring them to mind; for example, people can easily imagine an airplane crash but less readily imagine chronic lung disease, and so they exaggerate the likelihood of dying in the former and underrate the risk of the latter [19]. This misperception might lead to reduced engagement with preventive efforts like smoking cessation support [20]. The same effect might make doctors excessively risk-averse by endowing rare adverse outcomes with an outsized impact on decision-making.

The extent to which heuristics and biases impact behaviors is mediated by the environment. Behavioral economists use the term choice architecture for the whole range of features of the environment that shape behaviors [21] including default options, positive or negative framing, reminders, social factors (who is watching and what others are doing), cognitive factors (other concurrent decisions and ease of access to data), and uncertainty (regarding information or regarding outcomes) [14, 22, 23]. While a decision maker who is maximizing utility would optimally pursue their preferences irrespective of the choice architecture, human behavior is frequently influenced in predictable ways when choice architecture interacts with heuristics and biases.

This rich understanding of how humans think, feel, behave, and make decisions can allow choice architects to help people make better decisions. One way of doing this is through “nudging”, or altering the choice architecture to steer people towards better decisions (although the implications of behavioral economics go much further) [21, 24, 25]. Autoenrollment into pension savings is one example of a nudge [26]. Over the last decade the number of people in the UK saving for retirement has increased significantly because, rather than being asked to opt in to pensions, they were assumed to want to save, automatically signed into the program, and given the chance to opt out [27]. This changed outcomes because people making decisions with System 1 tend to stick with the default: at first the default was non-enrollment, now the default is enrollment. In some jurisdictions organ donation has undergone a similar change [28]. Choice architecture exists whether we intentionally design it or not. For example, if non-enrollment in organ donation is the default, it is still a default. It is incumbent on people designing choice architectures to consider whether predictable patterns of human behavior due to heuristics and biases will interact with features of the choice architecture in ways that help or hurt people.

Another way of leveraging insights from behavioral economics to help people make better decisions is to remove features of choices which are cognitively or emotionally draining, sometimes called “sludge” [24]. The term “cognitive misers” is sometimes used to describe how humans have a universal tendency to make decisions that conserve cognitive and emotional energy, and thus sludge can stop people making choices they would otherwise want to make [29, 30]. Examples of sludge include when a second-hand car dealer makes a customer sign a disclaimer before allowing them to decline an overpriced insurance add-on or when it is difficult to access naloxone in the event of opioid overdose. Since 2016, the state of California has made it legal for pharmacists to dispense naloxone without a prescription [31]. Desludging healthcare by making systems quick and straightforward to use can help clinicians, healthcare administrators, and patients make better decisions.

The findings of behavioral economics suggest that policy should draw on evidence of how humans actually behave. In his book Inside The Nudge Unit, David Halpern emphasized the importance of a more realistic model of behavior [32].

“A practical approach to government, or business, based on a realistic model of people would be messier than that of traditional economics or law. It would need to reflect the complexity of the human mind – what we do well, and what we don’t. It would imply thinking of cognitive capacities as wonderful, but precious resources. When we design services and products, we would need to be respectful of this reality, and remember that people have generally got better things to do than wade through bureaucracy or the puzzling ‘rationality’ of the state or big business. We would have to design everything we do around people, not expect people to have to redesign their lives around us.” [32]

Halpern put this approach into practice at the Behavioural Insights Team in the UK Civil Service, but this description also indicates how implementation science could be informed by behavioral economics. Applying a behavioral economics lens entails drawing on empirical evidence to ensure implementation science is informed by an awareness of what the human mind does well, where it struggles and tires, and how people respond to different choice architectures. It means considering how humans really behave within the context of implementation, not how we hope they will behave.

General insights from behavioral economics on decision-making have been largely overlooked in implementation science

Implementation scientists have largely overlooked the impact of bounded rationality on decision-making [8, 33]. Clinicians are expected to change behavior as new practices or policies are introduced within their organizational contexts, and patients are expected to adhere to relevant advice or medication. In other words, the fundamental assumption is that knowledge is a major mechanism of behavior change. However, scholarship demonstrates that knowledge may be necessary but is rarely sufficient for behavior change [8]. Therefore, most implementation endeavors stand to benefit from considering a behavioral economics lens, which could include defaults, cognitive bandwidth, motivated reasoning, or any of the other areas where research in behavioral economics is relevant for understanding the challenges of implementing new practices that require behavior change. For example, altering electronic prescribing systems so that the preferred prescribing option is selected automatically uses a default to increase evidence-based prescribing [34].

Several well-known implementation science models draw on social-cognitive theory, such as Theoretical Domains Framework (TDF), Capability Opportunity Motivation Behaviour (COM-B) and Clinical Performance Feedback Intervention Theory (CP-FIT). These theories incorporate rich conceptions of human behavior and decision-making, but do not explicitly move beyond the assumption that we are mindful and deliberate in all our actions [35, 36]. The full depth of relevant findings about human behavior, particularly how heuristics and biases influence behavior in surprising but replicable ways, has not yet been systematically integrated with implementation science. This gap presents an opportunity to design strategies that account for how humans actually behave rather than how we hope they will behave, and to enrich our understanding of the challenges of changing behavior and influencing decision-making as new practices are implemented [35,36,37]. Emerging work in this space has not yet been fully embedded within the general implementation science approach but represents a promising direction for forward movement [38, 39]. Table 1 outlines how the behavioral economics lens can inform implementation science.

Table 1 What does a Behavioral Economics Lens offer to IS?

The behavioral economics lens can guide the design of implementation strategies

To move towards integrating behavioral economics approaches in implementation science, we must evaluate the role bounded rationality and behavioral factors play in implementation to optimally design implementation strategies. By capitalizing on evidence of the impact of heuristics and biases and the potential impact of choice architecture, the behavioral economics lens can help with better design of implementation strategies to change behavior at multiple levels, including the individual, team, profession, and organization. Over recent years implementation scientists have adopted systematic approaches to selecting implementation strategies, drawing on logic models and taxonomies of implementation strategies, along with implementation mapping [40]. These approaches have strengthened the field by targeting specific contextual barriers identified by constituents. Behavioral design takes this a step further by incorporating the biases and heuristics which can influence behavior but may remain outside of constituents’ conscious awareness [41]. Two behavioral economic typologies, EAST and MINDSPACE offer guides for researchers considering incorporating heuristics and biases into implementation strategy design [14, 42].

EAST incorporates four domains where behavioral insights apply: easiness, attractiveness, social factors, and timing. Considering these domains can guide scholars designing implementation approaches to consider whether their approach capitalizes on recognized ways of changing human behavior. These four elements can be used in different ways; for example, men’s health interventions like Movember have drawn on social factors by prompting conversations about men’s health, whereas the Do It For Babydog vaccination campaign in West Virginia gave children the chance to win a party for their whole school, creating a different kind of social impetus [43, 44]. Conversely, considering the four EAST domains at the design stage can bring overlooked features to attention. This can be seen when implementation strategies include incentives: deferred incentives are less powerful than immediate incentives, so consideration of the timing domain could help an implementation designer consider ways of offering a reward alongside the behavior rather than waiting until later.

Whereas EAST provides factors to consider, MINDSPACE gives a specific list of strategies. This more extensive typology lists simple nudges, such as incentives, norms, and defaults, that implementation scientists may consider when looking for ways to boost behavior change. In applying a behavioral economics lens, the implementation scientist might review the MINDSPACE framework to find an appropriate technique. The implementation designer may decide that it would be appropriate to include “Commitments” by inviting participants to make a public promise to change practice. Alternatively, the pre-mortem approach can be used to leverage the power of prospective hindsight, where team members imagine an implementation effort has already failed and discuss all the causes of the failure, to make potential mitigation targets more salient [45, 46]. In each case, EAST helpfully draws attention to domains where the behavioral economics lens might apply and the MINDSPACE framework offers specific strategies which could be incorporated.

One study that illustrates how to apply these concepts to the design of implementation strategies was conducted by Patel and colleagues [2]. A change in the choice architecture, in which generic medications became the default rather than name-brand medications, resulted in a 5 percentage-point greater increase in prescriptions of the default option. Large changes were noted for prescriptions where there was little clinical difference between preparations, but smaller changes for prescriptions such as thyroxine and, to a lesser extent, anti-epileptics, demonstrating that prescribers overrode defaults and maintained agency where there was a clinical indication [47]. In this case, Patel et al. focused on easiness from the EAST framework and used the default approach from the MINDSPACE repertoire.

The behavioral economics lens can inform the interpretation of implementation trials

Post-implementation evaluation can also be informed by the behavioral economics lens. Making sense of the outcomes of studies of different implementation strategies is not straightforward, particularly when unexpected results arise [48]. Evaluation studies often incorporate mixed qualitative and quantitative methodologies which benefit from a theoretical framework, and the behavioral economics lens provides one such framework. Post-trial evaluations perform the crucial function of appraising the replicability of findings and may note extraneous factors which influenced the study, such as changes to the organizational or national context. Similarly, surprising findings may be explained by behavioral factors. For example, an otherwise effective implementation may be undermined by a default policy or norm; identifying such behavioral factors allows implementation scientists to offer more insightful advice for future studies.

Purtle et al.’s recent dissemination study illustrates this approach [49]. The authors explored whether state legislators were more likely to open emails which contained local economic data about the impact of adverse childhood experiences (ACEs). Purtle et al. found in secondary analysis that Democratic legislators were more likely to open emails labelled as containing useful economic data about ACEs, whereas Republican legislators were no more likely to open those emails than emails offering no economic data. The authors briefly discussed motivated reasoning, a concept incorporated into behavioral economics from the social cognition literature, describing how a desire to hold certain beliefs influences the way people seek out and evaluate sources of information [49, 50] (see Footnote 2). Information avoidance, a concept closely related to motivated reasoning, helps make sense of their surprising findings: people tend to avoid finding out facts that threaten their pre-existing beliefs [51]. If Republicans are in general wary of government intervention in family life, they may avoid information which implies government should act around ACEs. Purtle et al.’s use of the behavioral economics lens is innovative and provides a model for others trying to make sense of unexpected findings.

Another recent study by Glidewell et al. evaluated the success of several strategies for changing practice in UK primary care. Their interpretation of the results was implicitly in keeping with common findings from behavioral economics [52]. The authors found strong evidence of information avoidance in their less successful implementations; where searches of patient lists were expected to create an unmanageable amount of work, administrative staff admitted not conducting the searches. Glidewell et al. appear to have drawn on insights from behavioral economics in their design and interpretation of this paper, illustrating the usefulness of the behavioral economics lens. However, they did not cite the behavioral economics literature or use terminology from behavioral economics. While these omissions are reasonable when communicating only with other implementation scientists, they represent a missed opportunity to use shared keywords that would allow behavioral economists to benefit from frontline applications of these theories. Behavioral economics brings together ideas such as social comparison and information avoidance from other disciplines such as social psychology; using the same terminology can also enrich implementation science and provide useful explanations for better understanding findings that benefit from a behavioral lens.

Objections to behavioral economics

While behavioral economics offers new insights to boost implementation science, it is important to highlight potential limitations to this approach.

While some implementation efforts, such as those around prescribing change, are located close to the individual choices of clinicians and are highly amenable to a wide range of behavioral interventions, many implementation efforts relate to team or organizational behavior. Behavioral economics is also relevant within the multilevel nature of implementation science, as many of the core ideas in behavioral economics relate to the way groups and teams work and how colleagues relate to each other. Common approaches to categorizing behavioral biases highlight the importance of attempts to change behavior by making desired actions social and taking into consideration how group norms can be presented [14]. Behavioral economists have also explored how bounded rationality alters group decision-making, team coordination, and colleague effort levels. Across teamworking tasks, coordination tasks, and competitive tasks, behavioral economists have found surprising results which could not be explained by traditional economic models [53]. For example, when colleagues were randomly paired for an effort task they tended to perform equally, specifically because the lower-performing participant worked harder to match the higher performer. However, more work is needed to apply these concepts to the team and organizational levels.

Similarly, the importance of sustainment is increasingly recognized by implementation science [54]. Interventions from behavioral economics vary by duration of behavior change. For example, removing sludge or setting default options can influence behaviors repeatedly [14]. Other behavioral insights such as social comparison nudges may only influence behavior while the intervention is actively being managed, just as an implementation strategy with a didactic education component is difficult to sustain as new staff arrive unless sustainment has been actively managed. Implementation scientists drawing on the behavioral economics lens should consider which elements match the duration of behavior change sought.

Finally, some behavioral economists have recently warned that prioritizing nudging over all other interventions could distract policymakers’ attention from the underlying drivers of adverse policy outcomes such as structural inequities [55]. Scholars have noted the importance of using integrated strategies which combine nudges with political and social approaches to improve outcomes broadly and equitably [56]. In public health contexts such an approach could entail combining individual behavior change strategies with advocacy for investment in equitable access. It is important that implementation scientists remain alert to outer context factors when drawing on the behavioral economics lens and we anticipate that the field is well positioned to avoid this potential limitation of behavioral economics [57].

Conclusion

Implementation science has rapidly attained influence and respect because of its positioning as a convergence field bringing together transdisciplinary approaches to closing the gap between what we know and what we do. Notably, the behavioral economics lens has featured little within the development of the field. We have argued that implementation science can now be enhanced by the incorporation of approaches from behavioral economics, particularly by considering heuristics and behavioral biases that shape decision-making and behavior and by leveraging these known, predictable patterns to design choice architecture within the context of implementation strategy design.

This opportunity to integrate a behavioral economics lens into implementation science merits continued attention and consideration. If otherwise well-designed implementation strategies are undermined by behavioral economic phenomena, there is a risk that the field of implementation science will have reduced impact. Just as the AACTT framework (action, actor, context, target, and time) has helped implementation scientists report intended behavior change mechanisms with greater clarity, we suggest that explicit description of the changes to choice architecture and their anticipated effect on behavior would help other implementation scientists to evaluate and replicate implementation approaches [58]. To make this easier for other implementation scientists, scholars may consider augmenting implementation science frameworks and taxonomies with relevant behavioral insights. With these additional frameworks, implementation scientists have the potential to supercharge efforts to expand the provision and availability of evidence-based practices [59].

Availability of data and materials

Not applicable.

Notes

  1. Implementation science often focuses on clinician behavior change, but we recognize that implementation science increasingly attends to how behavior change among a broad range of individuals and groups, including healthcare leaders, policymakers, and even patients, influences the extent to which evidence-based practices and policies are put into practice. For simplicity, we will generally refer to clinician behavior change in this article, but the principles and recommendations can be relevant to other actors’ behaviors [59].

  2. It is important to note that Social Cognition is a blend of social and cognitive psychology and is different from Social-Cognitive approaches discussed above.

References

  1. Paltiel AD, et al. Clinical Outcomes Of A COVID-19 Vaccine: Implementation Over Efficacy. Health Aff (Millwood). 2021;40(1):42–52.


  2. Patel MS, et al. Using default options within the electronic health record to increase the prescribing of generic-equivalent medications: a quasi-experimental study. Ann Intern Med. 2014;161(10 Suppl):S44–52.


  3. Kelley MA, et al. Association of Fatal Overdose Notification Letters With Prescription of Benzodiazepines: Secondary Analysis of a Randomized Clinical Trial. JAMA Intern Med. 2022;182(10):1099–100.


  4. Rockwell MS, et al. A “sludge audit” for health system colorectal cancer screening services. Am J Manag Care. 2023;29(7):e222–8.


  5. Bauer MS, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32.


  6. Beidas RS, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17(1):55.


  7. Williams NJ, Beidas RS. Annual Research Review: The state of implementation science in child psychology and psychiatry: a review and suggestions to advance the field. J Child Psychol Psychiatry. 2019;60(4):430–50.


  8. Beidas RS, Buttenheim AM, Mandell DS. Transforming Mental Health Care Delivery Through Implementation Science and Behavioral Economics. JAMA Psychiat. 2021;78(9):941–2.


  9. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2013.

  10. Milkman KL, et al. A 680,000-person megastudy of nudges to encourage vaccination in pharmacies. Proc Natl Acad Sci USA. 2022;119(6):e2115126119.


  11. Doctor JN, et al. Opioid prescribing decreases after learning of a patient’s fatal overdose. Science. 2018;361(6402):588–90.


  12. Lucas T, et al. Message Framing for Men? Gender Moderated Effects of Culturally Targeted Message Framing on Colorectal Cancer Screening Receptivity among African Americans. Psychol Men Masc. 2023;24(2):103–12.


  13. Hodson N. De-sludging healthcare systems. BMJ. 2023;383: p2916.

  14. Dolan P, Hallsworth M, Halpern D, King D, Vlaev I. MINDSPACE: Influencing behaviour through public policy. London, UK: Cabinet Office and Institute for Government; 2010.

  15. Simon HA. Models of Man, Social and Rational: Mathematical Essays on Rational Human Behavior. New York, NY: Wiley; 1957.

  16. Wilke A, Mata R. Cognitive bias. In: Ramachandran VS, editor. Encyclopedia of Human Behavior. 2nd ed. Cambridge: Academic Press; 2012. p. 531–5.

  17. Cheung SL, Tymula A, Wang X. Quasi-hyperbolic present bias: a meta-analysis. IZA Institute of Labor Economics Discussion Papers. Bonn; 2021.

  18. Song F, Shou Y, Olney J, et al. Moral judgments under uncertainty: risk, ambiguity and commission bias. Curr Psychol. 2024;43(11):9793–804.

  19. Folkes VS. The Availability Heuristic and Perceived Risk. Journal of Consumer Research. 1988;15(1):13–23.

  20. Masiero M, Lucchiari C, Pravettoni G. Personal fable: optimistic bias in cigarette smokers. Int J High Risk Behav Addict. 2015;4(1): e20939.

  21. Thaler RH, Sunstein CR. Nudge: Improving Decisions About Health, Wealth, and Happiness. New Haven, CT: Yale University Press; 2008.

  22. Mullainathan S, Shafir E. Scarcity: Why Having Too Little Means So Much. New York, NY: Times Books; 2013.

  23. Kahneman D, Tversky A. Prospect Theory: An Analysis of Decision under Risk. Econometrica. 1979;47(2):263–92.

  24. Thaler RH. Nudge, Not Sludge. Science. 2018;361(6401):431.

  25. Hodson N. Cancer screening and accessibility bias: People want screening when informed it saves no lives. Behavioural Public Policy. 2023;7(1):157–69.

  26. Thaler RH, Benartzi S. Save More Tomorrow™: Using Behavioral Economics to Increase Employee Saving. Journal of Political Economy. 2004;112(S1):S164–87.

  27. Bielawska K, Turner JA. Trust and the behavioral economics of automatic enrollment in pensions: a comparison of the UK and Poland. Journal of Economic Policy Reform. 2023;26(2):216–37.

  28. Johnson EJ, Goldstein D. Do Defaults Save Lives? Science. 2003;302(5649):1338–9.

  29. Sunstein C. Sludge Audits. Behavioural Public Policy. 2022;6(4):654–73.

  30. Stanovich KE. Why humans are cognitive misers and what it means for the Great Rationality Debate. In: Viale R, editor. Routledge Handbook of Bounded Rationality. Boca Raton: CRC Press; 2020.

  31. Puzantian T, Gasper JJ. Provision of Naloxone Without a Prescription by California Pharmacists 2 Years After Legislation Implementation. JAMA. 2018;320(18):1933–4.

  32. Halpern D. Inside the Nudge Unit: How Small Changes Can Make a Big Difference. London, UK: Penguin Random House UK; 2015.

  33. Birken SA, et al. Organizational theory for dissemination and implementation research. Implement Sci. 2017;12(1):62.

  34. Delgado MK, et al. Association between Electronic Medical Record Implementation of Default Opioid Prescription Quantities and Prescribing Behavior in Two Emergency Departments. J Gen Intern Med. 2018;33(4):409–11.

  35. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42.

  36. Brown B, et al. Clinical Performance Feedback Intervention Theory (CP-FIT): a new theory for designing, implementing, and evaluating feedback in health care based on a systematic review and meta-synthesis of qualitative research. Implement Sci. 2019;14(1):40.

  37. Atkins L, et al. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12(1):77.

  38. Yoong SL, et al. Nudge strategies to improve healthcare providers’ implementation of evidence-based guidelines, policies and practices: a systematic review of trials included within Cochrane systematic reviews. Implement Sci. 2020;15(1):50.

  39. Quanbeck A, Hennessy RG, Park L. Applying concepts from “rapid” and “agile” implementation to advance implementation research. Implement Sci Commun. 2022;3(1):118.

  40. Powell BJ, et al. Methods to Improve the Selection and Tailoring of Implementation Strategies. J Behav Health Serv Res. 2017;44(2):177–94.

  41. Stewart RE, et al. Applying NUDGE to Inform Design of EBP Implementation Strategies in Community Mental Health Settings. Adm Policy Ment Health. 2021;48(1):131–42.

  42. Service O, Hallsworth M, Halpern D, Algate F, Gallagher R, Nguyen S, Ruda S, Sanders M. EAST: Four Simple Ways to Apply Behavioural Insights. London, UK: The Behavioural Insights Team; 2014.

  43. Adams SA. Third ‘Do It for Babydog’ COVID-19 Vaccination Lottery Aimed at Children. The Intelligencer. Wheeling, WV; 2021.

  44. Walker ET, et al. Patient-activist or ally? Assessing the effectiveness of conscience and beneficiary constituents in disease advocacy fundraising. Sociol Health Illn. 2023;45(8):1652–72.

  45. Kahneman D, Klein G. Strategic decisions: when can you trust your gut? McKinsey Quarterly. 2010.

  46. Klein G. Performing a project postmortem. Harvard Business Review. 2007.

  47. Patel MS, et al. Generic Medication Prescription Rates After Health System-Wide Redesign of Default Options Within the Electronic Health Record. JAMA Intern Med. 2016;176(6):847–8.

  48. Wolfenden L, et al. Designing and undertaking randomised implementation trials: guide for researchers. BMJ. 2021;372: m3721.

  49. Purtle J, et al. Partisan differences in the effects of economic evidence and local data on legislator engagement with dissemination materials about behavioral health: a dissemination trial. Implement Sci. 2022;17(1):38.

  50. Kunda Z. The case for motivated reasoning. Psychol Bull. 1990;108(3):480–98.

  51. Golman R, Hagmann D, Loewenstein G. Information Avoidance. Journal of Economic Literature. 2017;55(1):96–135.

  52. Glidewell L, et al. Explaining variable effects of an adaptable implementation package to promote evidence-based practice in primary care: a longitudinal process evaluation. Implement Sci. 2022;17(1):9.

  53. Camerer CF, Malmendier U. Behavioral economics of organizations. In: Diamond P, Vartiainen H, editors. Behavioral Economics and Its Applications. Princeton: Princeton University Press; 2007. p. 235–90.

  54. Nathan N, et al. Do the Expert Recommendations for Implementing Change (ERIC) strategies adequately address sustainment? Front Health Serv. 2022;2: 905909.

  55. Chater N, Loewenstein G. The i-frame and the s-frame: How focusing on individual-level solutions has led behavioral public policy astray. Behav Brain Sci. 2023;46:e147.

  56. Hallsworth M. A manifesto for applying behavioural science. Nat Hum Behav. 2023;7:310–22.

  57. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.

  58. Presseau J, et al. Action, actor, context, target, time (AACTT): a framework for specifying behaviour. Implement Sci. 2019;14(1):102.

  59. Presseau J, et al. Enhancing the translation of health behaviour change research into practice: a selective conceptual review of the synergy between implementation science and health psychology. Health Psychol Rev. 2022;16(1):22–49.

  60. Pfadenhauer LM, et al. Making sense of complexity in context and implementation: the Context and Implementation of Complex Interventions (CICI) framework. Implement Sci. 2017;12(1):21.

  61. Squires JE, et al. Stakeholder Perspectives of Attributes and Features of Context Relevant to Knowledge Translation in Health Settings: A Multi-Country Analysis. Int J Health Policy Manag. 2021;11(8):1373–90.

  62. Nilsen P, Bernhardsson S. Context matters in implementation science: a scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. 2019;19(1):189.

  63. Waltz TJ, et al. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14(1):42.

Acknowledgements

Rinad Beidas is grateful to her colleagues at the Penn Center for Health Incentives and Behavioral Economics for their partnership in bringing behavioral economics to implementation science. She also specifically thanks Justin Bekelman, MD, Alison Buttenheim, PhD, David Mandell, ScD, Robert Schnoll, PhD, and Kevin Volpp, MD, PhD, for their partnership in co-leading two centers at the intersection of behavioral economics and implementation science, which has contributed to her thinking in this sphere.

Funding

No funder had a role in the production of this manuscript but RB acknowledges P50CA244690 (Schnoll, Bekelman, Beidas). BJP was partially supported by the National Institutes of Health through R25MH080916, U24HL154426, R01CA262325, P50DA054072, P50CA19006, and P50CA244690. NH was partially supported by a National Institute for Health Research Academic Clinical Fellowship.

Author information

Contributions

NH led the writing of the first draft and RB contributed to writing specific sections. BP, PN and RB edited the text. All authors reviewed and approved the final version.

Corresponding author

Correspondence to Nathan Hodson.

Ethics declarations

Ethics approval and consent to participate

Not required.

Consent for publication

Not applicable.

Competing interests

NH and PN have no competing interests to declare.

BJP serves on the Editorial Board for Implementation Science and is also a Guest Editor for a special collection of Implementation Science and Implementation Science Communications that is titled, “Advancing Science and Practice through the Study of Implementation Mechanisms.” All editorial decisions regarding this manuscript were made by other Editors and/or Associate Editors.

RSB is principal at Implementation Science & Practice, LLC. She is currently an appointed member of the National Advisory Mental Health Council and the NASEM study, “Blueprint for a national prevention infrastructure for behavioral health disorders,” and serves on the scientific advisory board for AIM Youth Mental Health Foundation and the Klingenstein Third Generation Foundation. She has received consulting fees from United Behavioral Health and Optum-Labs. She previously served on the scientific and advisory board for Optum Behavioral Health and has received royalties from Oxford University Press. She serves as Associate Editor for Implementation Science and Implementation Science Communications. All editorial decisions regarding this manuscript were made by other Editors and/or Associate Editors.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Hodson, N., Powell, B.J., Nilsen, P. et al. How can a behavioral economics lens contribute to implementation science? Implementation Sci 19, 33 (2024). https://doi.org/10.1186/s13012-024-01362-y
